Ice Alive: Grants!

Ice Alive has a life of its own – no longer just a film, but an organization that exists to promote emerging scientists and communicators working on Earth’s changing ice and snow. Our website (icealive.org) is almost ready to launch, and we have just announced our inaugural Ice Alive grant scheme!


The grant will support 2-4 individuals or teams that have a novel idea for communicating cryospheric science on the broad theme of “Ice Alive”. We hope to see applications from artists, performers, musicians, writers, educators, journalists, scientists – anyone who has a great idea for spreading cryospheric science to new audiences in exciting ways.

All the details are HERE – please spread the word and/or apply yourself before 31st July 2018.


Sci-comm on Monocle24

The importance of communicating science, the value of art-science collaborations and the vulnerability of the Greenland Ice Sheet are the main topics of my radio interview at Monocle24, available to download HERE 

 


Bio-co-albedo?

At EGU I had the pleasure of talking about BioSNICAR and biological albedo reduction with two of the big names in albedo research. A very interesting point they raised was that the term ‘bioalbedo’ does not precisely describe the concept it is attached to. This is true. The term bioalbedo was not coined by spectroscopy or remote sensing experts, but by microbiologists and glaciologists, and is now well baked into the literature. I will outline here the reasons why we should be cautious of this terminology.

Albedo is the survival probability of a photon entering a medium. Light incident upon a material partly reflects from the upper surface; the remainder enters the medium and can scatter anywhere there is a change in the refractive index (e.g. a boundary between air and ice, or ice and water, etc.). Where there are opportunities for scattering, light bounces around in the medium, sometimes preferentially in a certain direction depending upon the optical properties of the medium (ice is forward-scattering) but always changing direction to some extent each time it scatters, until it is either absorbed or it escapes back out of the medium travelling in a skywards direction. The albedo of the material is the likelihood that the down-welling light entering the medium exits again later as up-welling light. The more strongly absorbing the material, the more likely the light is to be absorbed before exiting. Ice is very weakly absorbing in blue wavelengths (~400 nm) and becomes generally more strongly absorbing at longer wavelengths into the near infra-red (hence ice often appears blue). Solar energy is mostly concentrated within the wavelength range 300 – 5000 nm, and albedo describes the survival probability of photons within this range, either at a particular wavelength (spectral albedo) or integrated over the entire solar spectrum (broadband albedo).

This means that a photon entering a material with a broadband albedo of 0.8 has an 80% chance of exiting again. Therefore, when a material is bombarded with billions of photons, 80% of them are returned skywards and 20% are absorbed, and the surface appears bright. A lower albedo therefore means less likelihood of photon survival.
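To make the ‘survival probability’ picture concrete, here is a toy Monte Carlo sketch in Python (purely illustrative, not a real radiative transfer scheme: it assumes a semi-infinite, uniform, isotropically scattering medium and ignores refraction at the surface). Each photon takes random steps into the medium, is absorbed with some fixed probability at each interaction, and otherwise scatters into a new direction; the albedo is simply the fraction of photons that make it back out.

import random

def estimate_albedo(p_absorb, n_photons=50000):
    # Toy Monte Carlo estimate of albedo for a semi-infinite, isotropically
    # scattering medium. p_absorb is the probability that a photon is absorbed
    # at each interaction (a weakly absorbing medium has a small p_absorb).
    escaped = 0
    for _ in range(n_photons):
        depth = 0.0   # optical depth below the surface
        mu = 1.0      # direction cosine: +1 = straight down, -1 = straight up
        while True:
            depth += mu * random.expovariate(1.0)   # random free path to the next interaction
            if depth < 0:
                escaped += 1                        # photon exits skywards: it 'survives'
                break
            if random.random() < p_absorb:
                break                               # absorbed before it could escape
            mu = random.uniform(-1.0, 1.0)          # scatter isotropically into a new direction
    return escaped / n_photons

# weak absorption gives a high albedo; stronger absorption gives a low albedo
print(estimate_albedo(0.0001), estimate_albedo(0.01), estimate_albedo(0.1))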

For a single material, its absorbing and scattering efficiencies are described using the scattering and absorption coefficients. The ratio of the scattering coefficient to the extinction coefficient (the sum of the scattering and absorption coefficients) is known as the single scattering albedo (SSA), which is a crucial term for radiative transfer. A higher SSA is associated with a greater likelihood of a particle scattering a photon rather than absorbing it. A particle with SSA = 1 is non-absorbing.

Therefore, with these definitions we can see why the term bio-albedo is not semantically perfect. The term bio-albedo implies that the relevant measurement is the light reflected from biological cells, which is really the opposite of the quantity of interest. Algal cells are strongly absorbing, and their effect on snow and ice albedo is to increase the likelihood of a photon being absorbed rather than scattered back out of the medium. For this reason, the better term would be bio-co-albedo, where co-albedo describes the fraction of the intercepted energy that is absorbed by the particles (i.e. 1 - SSA).
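In code the distinction is a one-liner; a minimal sketch (the function and variable names here are mine, for illustration only):

def single_scattering_albedo(scattering_coeff, absorption_coeff):
    # SSA = scattering / extinction, where extinction = scattering + absorption.
    # SSA = 1 means a purely scattering, non-absorbing particle.
    return scattering_coeff / (scattering_coeff + absorption_coeff)

def co_albedo(scattering_coeff, absorption_coeff):
    # Co-albedo = 1 - SSA: the fraction of the light a particle intercepts that
    # is absorbed rather than scattered - the quantity 'bio-co-albedo' refers to.
    return 1.0 - single_scattering_albedo(scattering_coeff, absorption_coeff)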

Bio-co-albedo is the more technically correct terminology, but it is also quite a subtle distinction, and arguably if we have calculated the single scattering albedo we have by default calculated the co-albedo (co-albedo = 1 – single scattering albedo), so the outcome is the same. The meaning of the term ‘bio-co-albedo’ is not obvious to those outside of the spectroscopy and remote sensing communities, which I think is a major issue since the topic is so broadly interdisciplinary. The simpler and more aesthetically pleasing ‘bio-albedo’ is justified in most cases, especially because it is already well used in the literature and more widely accessible. From a utilitarian perspective, bio-albedo wins out.

As an aside, it reminds me that I have often wondered whether ‘evolution’ is really an acceptable word for cryosphere scientists to use to describe the temporal development of – for example – a snowpack or ice surface. Evolution implies changes resulting from inherited characteristics passed through successive generations, plus random mutations that are selected for or against based on goodness of fit for the specific environment. A melting snowpack cannot ‘evolve’ as there are no ancestors, no selection, no inheritance, no generations. People also age over time, influenced by external factors, but we do not describe individuals as evolving – the same applies to a snowpack or glacier. Overall, I suspect splitting hairs over terms like bio-co-albedo does more to dissuade non-specialists from joining the conversation than it does to improve understanding of the processes involved.

ASD spectra processing with Linux & Python

I’m sharing my workflow for processing and analysing spectra obtained using the ASD Field Spec Pro, partly as a resource and partly to see whether others have refinements or suggestions for improving the protocols. I’m specifically using Python rather than any proprietary software to keep it all open source and transparent, and to keep control over every stage of the processing.

Working with .asd files

By default the files are saved with the extension .asd, which can be read by the ASD software ‘ViewSpec’. The software does allow the user to export the files as ascii using the “export as ascii” option in the dropdown menus. My procedure is to use this option to resave the files as .asd.txt. I usually keep the metadata by selecting the header and footer options; however, I deselect the option to output the x-axis because it is common to all the files and easier to add once later on. I choose to delimit the data using a comma to enable the use of the Pandas ‘read_csv’ function later.

To process and analyse the files I generally use the Pandas package in Python 3. To read the files into Pandas I first rename the files using a batch rename command in the Linux terminal:

cd /path/folder/

rename "s/.asd.txt/.txt/g" ** -v

Then I open a Python editor – my preference is to use the Spyder IDE that comes as standard with an Anaconda distribution. The pandas read_csv function can then be used to read the .txt files into a dataframe. Put this in a loop to add all the files as separate columns in the dataframe…

import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# numpy and matplotlib are used further down in the workflow

path = '/path/folder/'
spectra = pd.DataFrame()
filelist = os.listdir(path)

for file in filelist:
    # each exported file is a single comma-delimited column of reflectance values
    spectra[file] = pd.read_csv(os.path.join(path, file), header=None, skiprows=0).squeeze()

If you chose to add any header information to the file exported from ViewSpec, you can ignore it by skipping the appropriate number of rows in the read_csv keyword argument ‘skiprows’.

Usually each acquisition comprises numerous individual replicate spectra. I usually have 20 replicates as a minimum and then average them for each sample site. Each individual replicate has its own filename with a sequentially increasing number (site1…00001, site1…00002, site1…00003, etc.). My way of averaging these is to cut the extension and ID number from the end of the filenames, so that the replicates from each sample site are identically named. Then the Pandas function ‘groupby’ can be used to identify all the columns with equal names and replace them with a single column containing the mean of all the replicates.

# trim the sequential ID and extension (the last 10 characters) from each
# filename so that replicates from the same sample site share a name
filenames = []
for file in filelist:
    filenames.append(str(file)[:-10])

# rename the dataframe columns according to the trimmed filenames
spectra.columns = filenames

# average all replicate columns that share a name (one mean spectrum per site)
spectra = spectra.groupby(by=spectra.columns, axis=1).mean()

Then I plot the dataset to check for any errors or anomalies, and save the dataframe as one master file organised by sample location.

spectra.plot(figsize=(15,15))
plt.ylim(0,1.2)

spectra.to_csv('/media/joe/FDB2-2F9B/2016_end_season_HCRF.csv')

Common issues and workarounds…

Accidentally misnamed files

During a long field season I sometimes forget to change the date in the ASD software for the first few acquisitions and then realise I have a few hundred files to rename to reflect the actual date. This is a total pain, so here is a Linux terminal command to batch rename the ASD files to correct the date at the beginning of the filename.

e.g. to rename all files in a folder from 24_7_2016 that were accidentally saved with the previous day’s date, run the following command…

cd /path/folder/

rename "s/23_7/24_7/g" ** -v

Interpolating over noisy data and artefacts

On ice and snow there are known wavelengths that are particularly susceptible to noise due to water vapour absorption (e.g. near 1800 nm), and there may also be noise at the upper and lower extremes of the spectral range measured by the spectrometer. Also, where a randomising filter has not been used to collect spectra, there can be a step feature in the data at the crossover point between the internal arrays of the spectrometer (especially at 1000 nm). This is due to the spatial arrangement of fibres inside the fibre optic bundle: each fibre measures specific wavelengths, meaning that if the surface is not uniform, certain wavelengths are oversampled and others undersampled for different areas of the ice surface. The step feature is usually corrected by raising the NIR (>1000 nm) section to meet the VIS section (see Painter, 2011). The noise in the spectrum is usually removed and replaced with interpolated values. I do this in Pandas using the following code…

for i in spectra.columns:

    # correction factor for the detector step at band 650 (~1000 nm when the file
    # starts at 350 nm): raises the NIR section to meet the VIS section (see Painter, 2011)
    corr = spectra.loc[650, i] - spectra.loc[649, i]
    spectra.loc[650:2149, i] = spectra.loc[650:2149, i] - corr

    # remove and interpolate over instabilities at ~1800 nm (bands 1400-1650)
    spectra.loc[1400:1650, i] = np.nan
    spectra[i] = spectra[i].interpolate()

    # smooth the interpolated section with a rolling mean, then re-interpolate
    # over any NaNs introduced at the start of the rolling window
    spectra.loc[1400:1600, i] = spectra.loc[1400:1600, i].rolling(window=50, center=False).mean()
    spectra[i] = spectra[i].interpolate()
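Because the x-axis was deselected at the ViewSpec export stage, the index of the dataframe is just the band number. If an explicit wavelength axis is wanted in the saved master file, it can be attached in one line at the end of the processing – a sketch assuming the full 350 – 2500 nm range was exported at 1 nm resolution (2151 bands), applied only after any band-number-based slicing like the correction above:

# attach the wavelength axis that was left out of the ViewSpec export
spectra.index = np.arange(350, 2501)
spectra.index.name = 'wavelength_nm'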

The script is here for anyone interested… https://github.com/jmcook1186/SpectraProcessing

CASPA at EGU 2018

The EGU annual meeting in Vienna is one of the major events in the earth science calendar, where the latest ideas are aired and discussed and new collaborations forged. My talk this year was in the “Remote Sensing of the Cryosphere” session. Here’s an overview:

Albedo is a primary driver of snow melt. For clean snow and snow with black carbon, radiative transfer models do an excellent job of simulating albedo, yet there remain aspects of snow albedo that are poorly understood. In particular, current models do not take into account the algal cells that grow and dramatically discolour ice in some places (our 1-D BioSNICAR model being an exception), and few take into account changes in albedo over space and time.

This led me to wonder about using cellular automata as a mechanism for distributing radiative-transfer-based albedo modelling over three spatial dimensions and time, and for introducing a degree of stochasticity into the modelling (which is certainly present in natural systems).

Cellular automata are models built on a grid composed of individual cells. These individual cells update as the model progresses through time according to some function – usually a function of the values of the neighbouring cells. Cellular automata have been used extensively to study biological and physical systems in the past – for example, Conway’s Game of Life, Lovelock’s DaisyWorld and Bak’s Sandpile Model not only gave insight into particular processes, but arguably changed the way we think about nature at the most fundamental level. Those three models were epoch-changing for the concepts of complexity and chaos theory.

An implementation of Conway’s Game of Life, showing the grid updating in a complex fashion, driven by simple rules, by Jakub Konka
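As a flavour of how simple these update rules can be, here is a minimal numpy sketch of the Game of Life step driving animations like the one above (my own illustrative version, unrelated to the snow model described below):

import numpy as np

def life_step(grid):
    # One update of Conway's Game of Life on a 2D array of 0s and 1s.
    # Each cell's fate depends only on the sum of its eight neighbours.
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    # live cells survive with 2 or 3 neighbours; dead cells become live with exactly 3
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# random initial state stepped forward a few generations
grid = (np.random.rand(50, 50) > 0.7).astype(int)
for _ in range(10):
    grid = life_step(grid)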

For the snowpack, I developed a model I am calling CASPA – an acronym for Cellular Automaton for SnowPack Albedo. CASPA combines a cellular automaton approach with a degree of stochasticity to predict changes in snowpack biophysical properties over time.

At each timestep the model updates the biomass of each cell according to a growth model (an initial inoculum doubles in biomass). This new biomass has a user-defined probability of growing in situ (darkening that cell) or spreading to a randomly selected adjacent cell. Once this has occurred, the radiative transfer model BioSNICAR is called and used to predict the albedo and the energy absorbed per vertical layer. The subsurface light field is visualised as the planar intensity per vertical layer, per cell. The energy absorbed per layer is also used to define a temperature gradient, which in turn drives a grain evolution model. In the grain evolution model, wet and dry grain growth can occur, along with melting, percolation and refreezing of interstitial water, consistent with the grain evolution scheme in the Community Land Model. The new grain sizes are fed back into SNICAR ready for the albedo calculation at the next timestep.
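To illustrate the kind of stochastic update this involves, here is a toy sketch of the growth-and-spreading step on its own (the names, parameters and structure are my assumptions for illustration, not the actual CASPA code, and the radiative transfer, grain evolution and melt steps are omitted entirely):

import numpy as np

def grow_and_spread(biomass, p_spread=0.3, seed=None):
    # Toy version of the stochastic growth step: each cell's biomass doubles per
    # timestep, and the new growth either darkens the cell in situ or, with
    # probability p_spread, is passed to a randomly chosen adjacent cell.
    rng = np.random.default_rng(seed)
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    rows, cols = biomass.shape
    new = biomass.copy()
    for r in range(rows):
        for c in range(cols):
            growth = biomass[r, c]   # doubling growth model: the increment equals the current biomass
            if growth == 0:
                continue
            if rng.random() < p_spread:
                dr, dc = offsets[rng.integers(4)]
                new[(r + dr) % rows, (c + dc) % cols] += growth   # spread to a neighbour (wrap-around edges)
            else:
                new[r, c] += growth                               # grow in situ
    return new

# a single inoculum in the centre of a small grid, stepped forward a few times
grid = np.zeros((20, 20))
grid[10, 10] = 1.0
for _ in range(5):
    grid = grow_and_spread(grid)

In the full model, the updated biomass in each cell is what gets passed to BioSNICAR for the albedo calculation at that timestep.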

At the same time, inorganic impurities such as dust and soot can be incorporated into the model. These can be held constant throughout the model run, or can vary according to a user-defined scavenging or deposition rate. They can also melt out from beneath, with the inorganic impurities rising up through successive vertical layers per timestep.
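A similarly hedged sketch of the melt-out idea in isolation (again illustrative only, with layer 0 taken as the surface):

import numpy as np

def melt_out_step(impurity_mass):
    # Toy melt-out: the impurity mass in each vertical layer moves up one layer
    # per timestep and accumulates at the surface (layer 0).
    new = np.zeros_like(impurity_mass)
    new[:-1] = impurity_mass[1:]   # each layer inherits the mass from the layer below
    new[0] += impurity_mass[0]     # mass already at the surface stays there
    return new

# e.g. dust buried at depth gradually emerges at the surface
profile = np.array([0.0, 0.0, 0.0, 5.0, 0.0])
for _ in range(4):
    profile = melt_out_step(profile)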

The 2D albedo map output by CASPA showing the albedo decline due to an algal bloom growing on the snowpack

In this way, the albedo of a snowpack can be predicted in three spatial dimensions plus time. Taking the incoming irradiance into account, the radiative forcing can be calculated at each vertical depth in each cell per timestep. Furthermore, the energy available as photosynthetically active radiation in each layer can be quantified. Ultimately, these values can feed back into the growth model. Coupling the CASPA scheme with a sophisticated ecological model could therefore be quite powerful.

By default the model outputs a 2D albedo map and a plot of biomass against albedo. It is interesting to realise that the subtle probabilistic elements of the cellular model can lead to drastically different outcomes for the biomass and albedo of the snowpack even with identical initial conditions. This is also true of natural systems and the idea that an evolving snowpack can be predicted using a purely deterministic model seems, to me, erroneous. There are interesting observations to make about the spatial ecology of the system. Even this simplified system can runaway into dramatic albedo decline or almost none. It makes me wonder about natural snowpacks and the Greenland dark zone – how much of the interannual variation emerges from internal stochasticity rather than being a deterministic function of meteorology or glaciology?

A plot of albedo and biomass against time for CASPA. Each individual run is presented as a dashed line; the mean of all runs is represented as the solid line. The divergence in trajectory between individual runs is astonishing since these were all run with identical initial conditions – a result of emergent complexity and subtle imbalances in the probabilistic functions in the model.

In terms of quantifying biological effects on snow albedo, CASPA can be run with the grain evolution and inorganic impurity scavenging models turned ON or OFF. Comparing the albedo reduction taking into account the physical evolution of the snow with that when the snow physics remain constant provides an estimate of the indirect albedo feedbacks and the direct albedo reduction due to the algal cells.

This modelling approach opens up an interesting opportunity space for remote sensing in the cryosphere. In parallel to this modelling I have been working hard on a supervised classification scheme for identifying various biological and non-biological ice surface types using UAV and satellite remote sensing products. Coupling this scheme with CASPA offers an opportunity to upsample remote sensing imagery in space and time, or to set the initial conditions for CASPA using real aerial data and then experiment with various future scenarios. At the moment I lack any UAV data for snow with algal patches to actually implement the workflow, but it has been proven using multispectral UAV data from bare ice on the Greenland ice sheet. Once I obtain multispectral data for snow with algal blooms, it will be possible to automate the entire pipeline: loading the image, classifying it using a supervised classifier, and converting it into an n-dimensional array that can be used as an initial state for the CASPA cellular automaton, whose conditions can then be tweaked to experiment with various environmental scenarios.
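For anyone curious what that pipeline might look like, here is a hypothetical sketch (the classifier choice, class codes and array shapes are placeholders rather than the actual classification scheme, and it assumes labelled training spectra are already available):

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def image_to_initial_state(image, train_X, train_y, algal_class=1, inoculum=1.0):
    # Classify each pixel of a (bands, rows, cols) multispectral image and seed a
    # CASPA-style biomass grid wherever the 'algal' class is predicted.
    clf = RandomForestClassifier(n_estimators=100)
    clf.fit(train_X, train_y)                          # labelled spectra: shape (n_samples, bands)
    bands, rows, cols = image.shape
    pixels = image.reshape(bands, -1).T                # one row of band values per pixel
    labels = clf.predict(pixels).reshape(rows, cols)   # per-pixel surface class map
    biomass = np.where(labels == algal_class, inoculum, 0.0)  # initial state for the cellular automaton
    return labels, biomass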

Therefore, the limiting factor for CASPA at the moment is the availability of multispectral aerial data and field spectroscopy training data for algal blooms on snow. In the spirit of open science, and to try to stimulate a development community, I have made this code 100% open and annotated despite it being currently unpublished, and I’d be delighted to receive some pull requests!

In summary, I suggest that coupling radiative transfer with cellular automata, and potentially remote sensing imagery, is a promising way to push albedo modelling forwards into spatial and temporal variations, and an interesting way to build a degree of stochasticity into our albedo forecasting and ecological modelling.

Ice Alive: Uncovering the secrets of Earth’s Ice

In collaboration with Rolex Awards for Enterprise, Proudfoot Media and I have produced a documentary film explaining the latest research into the surprising hidden biology shaping Earth’s ice. The story is told by young UK Arctic scientists with contributions from guests including astronaut Chris Hadfield and physicist Jim Al-Khalili. We went to great lengths to make this a visually striking film that we hope is a pleasure to watch and communicates the otherworldly beauty and incredible complexity of the Arctic glacial landscape. We aim to educate, entertain and inspire others into exploring and protecting this most sensitive part of our planet in their own ways.

We think the film is equally suited to the general public as to school and university students, and we are delighted to make it a free-to-all teaching resource. Please watch, share and use!

 

Alongside this film, I also collaborated with musician Hannah Peel on an audiovisual piece designed to communicate, through sound, the complexity of the processes occurring on the Greenland Ice Sheet. View the piece (good headphones recommended!) and read the write-up here

Ice Alive: An audiovisual exploration of the Greenland Ice Sheet

 

As an Arctic scientist I am privileged to be able to explore the coldest parts of our planet, making observations and measurements and helping others to understand how these areas function by writing papers, giving talks and lectures, and writing for magazines and newspapers. But to truly understand an environment, we must also explore the intangible and immeasurable. To communicate it to diverse audiences, we must use not only facts and observations, but aesthetics and emotion. The piece above is a bridge connecting music and science – an effort to understand and communicate the hidden beauty, complexity and sensitivity of the Greenland Ice Sheet through sound. I hope that projects like this will bring new audiences to Arctic science, using music, art and aesthetics to pique their curiosity.

This project arose from a chance encounter in 2017. I was a guest on Radio 4’s Midweek programme, along with musician Hannah Peel. As I listened to her explain her art on air, and later listened to her music, especially her new album ‘Mary Casio’, I was struck by the depth of thought and analysis underpinning her work. I reached out to see if she would be interested in applying the same process to exploring the changing Arctic.

To my surprise and delight, Hannah agreed to make a new composition. We chatted about Arctic science – ice sheet dynamics, albedo feedbacks and microbiology in particular, and I provided footage and images from our field sites in Greenland and Svalbard. Hannah then went away and composed a piece of music inspired by the intricate processes, nested feedbacks and hidden complexity of this environment. I then cut the music to drone footage I filmed on site in 2017. I am overjoyed with the result, because I think Hannah’s music communicates perfectly the almost paradoxical sense of grandeur and intricacy, power and vulnerability of the ice.

Explore more of Hannah’s amazing music here