Arctic Summer-time Cyclones Field Campaign in Svalbard

Hannah Croad – h.croad@pgr.reading.ac.uk

The rapid decline of sea ice is permitting increased human activity in the summer-time Arctic, where that activity will be exposed to the risks of Arctic weather. Arctic cyclones are the major weather hazard in the summer-time Arctic, producing strong winds and ocean waves that impact sea ice over large areas. My PhD project is about understanding the dynamics of Arctic summer-time cyclones. One of the biggest uncertainties in our understanding is the interaction of cyclones with the surface and sea ice. Sea ice-atmosphere coupling is greatest in summer when the ice is thinner and more mobile. Strong winds associated with cyclones can move and alter the sea ice, but the sea ice state also feeds back on the development of cyclones, determining surface drag and turbulent fluxes of heat and moisture.


My PhD project is closely linked with the Arctic Summer-time Cyclones NERC project, and so I had the opportunity to join the associated field campaign. The field campaign team comprises scientists, engineers and pilots from the University of Reading, the University of East Anglia and British Antarctic Survey (BAS). The primary aim of the field campaign was to fly through Arctic cyclones, (i) mapping cyclone structure and (ii) obtaining the measurements necessary to characterise the cyclone-surface interaction. In particular, observations of near-surface fluxes of momentum, heat and moisture over sea ice and ocean are needed, as these fluxes dictate the impact of the surface on cyclones. These observations are needed to evaluate and improve the representation of turbulent exchange in numerical weather prediction (NWP) models, especially over sea ice where existing observations are sparse. To obtain accurate measurements of near-surface fluxes, we need to be quite close to the surface (no higher than 300 ft). To do this, we would be using BAS's Twin Otter aircraft, equipped with Meteorological Airborne Science INstrumentation (MASIN). The twin-engine prop aircraft is small and light, and is therefore ideal for flying at low levels just above the surface (as low as 50 ft!). There are many instruments fitted on the MASIN research aircraft, but the most important measurements for our purposes were temperature, wind speed, humidity (important for mapping cyclone structure), surface layer turbulent fluxes (from the 50 Hz turbulence probe), and ice surface properties (from the laser altimeter).

British Antarctic Survey’s Twin Otter aircraft, fitted with the MASIN equipment. You can see the turbulence probe on the boom at the front of the aircraft, and the CAPS (cloud, aerosol, and precipitation spectrometer) probe on the left wing. The pilot is on top of the aircraft, carrying out final checks before a science flight. Photo from John Methven.
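To give a flavour of what those flux measurements involve: the sensible heat flux comes from the covariance of rapid fluctuations in vertical wind and temperature measured on the low-level legs (the eddy-covariance method). Below is a minimal sketch with synthetic numbers; the real MASIN processing is far more involved (aircraft motion correction, de-spiking, coordinate rotation), so treat this purely as an illustration of the principle.

```python
import numpy as np

def sensible_heat_flux(w, T, rho=1.3, cp=1004.0):
    """Eddy-covariance sensible heat flux, H = rho * cp * <w'T'> (W m^-2).

    w : vertical wind (m s^-1), e.g. 50 Hz samples along one low-level leg
    T : air temperature (K) sampled at the same rate
    """
    w_prime = w - w.mean()  # fluctuations about the leg mean
    T_prime = T - T.mean()
    return rho * cp * np.mean(w_prime * T_prime)

# Synthetic example: 10 minutes of 50 Hz data with a weak w-T correlation
rng = np.random.default_rng(0)
n = 50 * 600
w = 0.3 * rng.standard_normal(n)
T = 271.0 + 0.2 * rng.standard_normal(n) + 0.1 * w  # correlated -> upward flux
print(f"H = {sensible_heat_flux(w, T):.1f} W m^-2")
```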

After a 1-year delay due to the Covid-19 pandemic, the field campaign took place in July and August 2022. We were based on the Norwegian archipelago of Svalbard, a 3-hour flight north of Oslo, in Longyearbyen, the main town on Svalbard. At 78°N, Longyearbyen is the northernmost town in the world! It is located within a valley on the shore of Adventfjorden. The town is a strange but charming place with lots of eccentricities: wooden buildings, pipes above the ground (as the ground freezes in winter), and old mining structures on the sides of the valley. The town is small but well provided for, with a few tourist shops, restaurants, and a supermarket. As Svalbard is within the Arctic Circle, during the summer months it experiences 24-hour sunlight, which was very strange! Furthermore, Longyearbyen is one of the only places on Svalbard that is ‘polar bear safe’ – you should only leave the town limits if you have a rifle!


The field campaign team worked at Longyearbyen airport. The team would study the forecasts from different weather models for the next week, to decide on flight plans. We were primarily looking for strong winds (ideally associated with cyclones, but beggars can’t be choosers!) over the sea ice, within range of the Twin Otter aircraft (approximately 600 nautical miles). With flight planning, there were many things to consider. It was a case of waiting for good weather to come to us, and planning rest days for the pilots when the weather wasn’t looking so interesting in the forecast. Flight plans would consist of transit to and from the target region, where science would be conducted. Science flying included low-level legs to obtain turbulent flux measurements, vertical profiles of the boundary layer, and stacked cross-sections through cyclone features (e.g. fronts) in and above the boundary layer. For flights where low-level flying was planned, it was key that there should not be low cloud in the target area, as this would prevent the aircraft from flying below 1000 ft for safety reasons. It was also important that there were no bad conditions (poor visibility or strong winds) in Longyearbyen, which would prevent the aircraft from taking off or landing. Longyearbyen is an isolated airfield, and the aircraft cannot carry enough fuel to make it back to the mainland if conditions are too poor to land, so this was a very important consideration. Furthermore, the American and French THINICE project field campaign was being conducted at the same time in Svalbard, with the SAFIRE ATR42 aircraft flying at higher levels, looking downwards on Arctic cyclones. We were able to co-ordinate several flights through the same weather systems, with the Twin Otter aircraft flying below the ATR42.


The Twin Otter aircraft holds 3-4 people, including the pilot. With an instrument engineer also on board, this left space for 1 or 2 scientists on each flight (Note: to fly on the aircraft we had to do helicopter underwater escape training – see my previous blog at https://socialmetwork.blog/2021/07/16/helicopter-underwater-escape-training-for-arctic-field-campaign/). The cabin is very small (too small for a person to stand up), and is rather cramped, with a considerable amount of space taken up by the extra-range fuel tank! The aircraft is flown between 50 and 10,000 ft, so the cabin is not pressurized. For low-level flying, the crew must wear immersion suits and life jackets (in the unlikely event that the aircraft must ditch in the ocean). In flight, the crew wear noise-cancelling headphones (as the engines are rather loud), and everyone can speak to each other over the intercom. During the flight the scientists may alter the flight plan if necessary, depending on the conditions they encounter, and take notes on the environment and any notable events. This includes noting what they can see out of the window (e.g. sea ice fraction, cloud), any interesting observations from the live feed of the instrument output within the aircraft (e.g. boundary layer depth), and any instruments that are not working or are faulty.


I had the opportunity to fly on the aircraft on the third science flight of the field campaign (I wrote about this in another blog: https://research.reading.ac.uk/arctic-summertime-cyclones/first-field-campaign-flying-experience/). We were targeting a region to the north-west of Svalbard, in the Fram Strait, where strong northerly winds were forecast over the marginal ice zone. The primary objective was to measure turbulent fluxes over sea ice at low level. However, on reaching the target region, we were unable to descend lower than 500 ft due to cloud and Arctic sea smoke (formed as cold Arctic air moves over the warmer water between the sea ice floes) at the surface – not safe conditions for flying at low level! Through gaps in the clouds, we got a glimpse of the Arctic sea smoke over the marginal ice zone (see below). (Note: several other flights in the field campaign encountered better conditions and were able to get down to low levels – see video below!) We searched for better conditions near the target region for an hour, but didn’t find any, so made the return trip home. It was a shame that we could not fly low enough to obtain turbulent flux measurements, but the flight was still useful for obtaining profiles of wind structure in the boundary layer, and for our understanding of forecast performance in the region.

Photos taken from the Twin Otter aircraft 500 ft above the surface, with a layer of Arctic sea smoke overlaying the ice floes of the marginal ice zone. Here visibility is too low to descend any further. Photos from Hannah Croad.
Flying over the marginal ice zone at 70 ft in good visibility conditions, with the shadow of the Twin Otter aircraft visible. Video from John Methven.

During the month-long field campaign a total of 17 science flights were conducted, flying in all directions from Longyearbyen, with an accumulated 80 hours of flying time. This included 4 Arctic cyclone cases, and 7.5 hours of surface layer turbulent flux measurements (more than we could have hoped for!). The data from the aircraft is currently undergoing quality control. Analysis will now proceed in two streams:

  1. Run simulations of Arctic cyclone cases in NWP models, evaluating against field campaign observations and using various tools to relate surface friction and heating to cyclone evolution (led by the University of Reading team)
  2. Use observations of turbulent fluxes in the surface layer over the marginal ice zone and sea ice properties to improve the representation of turbulent exchange over sea ice – i.e. develop parametrizations (led by the University of East Anglia team)

Building on the outputs and findings from these two work packages, we will then run sensitivity experiments of Arctic cyclones in NWP models, using the revised turbulent exchange parametrizations, to understand the impact on cyclone development.

A summary of all the science flights conducted during the Arctic Summer-time Cyclones field campaign. Flight routes are coloured blue-yellow, indicating flight altitude. Also plotted is the campaign mean sea ice fraction (AMSR2).

I really enjoyed my time on the field campaign, and I learnt a lot! It was great to help the team with forecasting and flight planning, and to be on a science flight. I also got to do a bit of media work, talking on BBC Radio 4’s Inside Science programme (https://www.bbc.co.uk/programmes/m0019z2y). It was a fantastic experience, and now the team and I are looking forward to getting started with the analysis and using the data!

Arctic Summer-time Cyclones field campaign team (some missing) in front of the Twin Otter aircraft. Photo from Dan Beeden.

Urban observations in Berlin

Martina Frid – m.a.h.frid@pgr.reading.ac.uk

Beth Saunders – bethany.saunders@pgr.reading.ac.uk

Introduction 

With a large (and growing) proportion of the global population living in cities, research undertaken in urban areas is important, especially in hazardous situations (heatwaves, flooding, etc.), which are becoming more severe and more frequent due to climate change.  

This post gives an overview of recent work done for urbisphere, a Synergy Project funded by the European Research Council (urbisphere 2021), which aims to forecast feedbacks between weather, climate and cities.  

Berlin Field Campaign 

The project has included a year-long field campaign (Autumn 2021 – Autumn 2022) undertaken in Berlin (Fig. 1). A smart Urban Observation System was used to take measurements across the city. Sensors used include ceilometers, Doppler wind LIDARs, radiometers, thermal cameras, and large aperture scintillometers (LAS). These measurements were taken to provide new information about the impact of Berlin (and other cities) on the urban boundary layer. The unique observation network was able to provide dense, multi-scale measurements, which will be used to evaluate and inform weather and climate models.  

Figure 1: Locations of the urbisphere sensors in Berlin, Germany (urbisphere 2021).

Large Aperture Scintillometry in Berlin

The Berlin field campaign has included 6 LAS paths (Fig. 1). Each LAS path consists of a transmitter and a receiver mounted in the open air (Fig. 2), 0.5–5 km apart (e.g. Ward et al. 2014).

A beam of near-infrared radiation (wavelength ~850 nm) is passed from the transmitter to the receiver, where the beam intensity is measured. Turbulent fluctuations in the refractive index of the air along the path make the received intensity flicker, and the strength of this scintillation is used to derive the turbulent sensible heat flux. As the received intensity is the result of fluctuations all along the beam, derived quantities are spatially integrated, and are therefore at a larger scale compared to other flux measurement techniques (e.g. eddy covariance).

Figure 2: One of six large aperture scintillometer paths (orange). Ground height (blue) is shown between the receiver site (GROP) and transmitter site (OSWE) in Berlin. The path’s effective beam height is 50 m above ground level.
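To make that measurement chain a little more concrete, below is a rough sketch of one simplified retrieval route: the widely cited LAS relation between the variance of the log intensity and the refractive index structure parameter Cn², a conversion to the temperature structure parameter CT² that ignores humidity (Bowen ratio) corrections, and the free-convection scaling of the sensible heat flux (valid only in unstable daytime conditions). The aperture diameter and intensity variance are made-up numbers, and operational processing (e.g. Ward et al. 2014) involves several corrections omitted here.

```python
import numpy as np

# --- Illustrative values (approximate; not the Berlin instrument settings) ---
D = 0.15      # aperture diameter (m) -- hypothetical
L = 5000.0    # path length (m), roughly the GROP-OSWE path
z_eff = 50.0  # effective beam height (m), from Fig. 2
P = 1000.0    # air pressure (hPa)
T = 293.0     # air temperature (K)
rho, cp, g = 1.2, 1004.0, 9.81

def sensible_heat_flux(var_lnI):
    """Free-convection sensible heat flux from the variance of log intensity.

    1) refractive index structure parameter (standard LAS relation)
    2) temperature structure parameter, neglecting humidity corrections
    3) free-convection scaling (b ~ 0.48), unstable daytime conditions only
    """
    Cn2 = 1.12 * var_lnI * D**(7 / 3) / L**3
    CT2 = Cn2 * (T**2 / (7.76e-5 * P))**2
    return 0.48 * rho * cp * z_eff * np.sqrt(g / T) * CT2**0.75

print(f"H ~ {sensible_heat_flux(var_lnI=0.05):.0f} W m^-2")  # made-up variance
```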

Our Visit to Berlin

During the first week of August, we travelled to Berlin for three days of fieldwork, to prepare for an intense observation period (IOP). This trip included installing sensors and testing that they worked as expected. We visited three observation sites: GROP (123 m above sea level, Fig. 2), OSWE (63 m, Fig. 2) and NEUK (60 m).

One of the main purposes of this visit was to align two of the LAS paths (including the one in Fig. 2). Initially, work is undertaken at the transmitter site (Fig. 3, top) to point the instrument in the approximate direction of the receiver using a sight (Fig. 3, right-hand photographs).

At the receiver site (Fig. 3, bottom), the instrument’s measurement of signal strength can be displayed on a monitor in real time. Using this output as a guide, small adjustments to the receiver’s alignment are made by loosening or tightening two bolts on the mount; one which adjusts the receiver’s pitch, and one which adjusts the yaw. This was carried out until we reached a peak reading in signal strength, indicating the path was aligned.

Figure 3: Photographs of the large aperture scintillometer transmitter at site OSWE (top) and receiver at site GROP (bottom).

Our contribution to the IOP

Back in Reading, daily weather forecasts were carried out for the IOP, to determine when ground-based observations could be made. As the field campaign coincided with the central European heatwave, some of the highest temperatures were recorded during the IOP, and there was a need to forecast thunderstorms and the possibility of lightning strikes.

Ideal conditions for observations were clear skies and a consistent wind direction with height. A variety of different wind directions during the IOP was also preferable, to capture different transects of Berlin. For the selected days, group members in Berlin deployed multiple weather balloons simultaneously across multiple sites within the city and the outskirts. This was also timed with satellite overpasses. Observations of the mixing layer height (urban and suburban) were taken using a ceilometer mounted in a van, which drove along different transects of Berlin.

As the field campaign is wrapping up in Berlin, several instruments are now being moved to the new focus city: Paris. We are looking forward to this new period of interesting observations! Thank you and goodbye from us at the top of the GROP observation site!

References

urbisphere, 2021: Project Context and Objectives. http://urbisphere.eu/ (accessed 27/09/22)

Ward, H. C., J. G. Evans, and C. S. B. Grimmond, 2014: Multi-Scale Sensible Heat Fluxes in the Suburban Environment from Large-Aperture Scintillometry and Eddy Covariance. Boundary-Layer Meteorol., 152, 65–89.

Science Stand-up: Putting those Met Panto Skills to Good Use 

Max Coleman – m.r.coleman@pgr.reading.ac.uk  


I’ve been keen to ‘do my bit’ for climate science communication for a while now. While I do like attending a good public lecture or seminar, I wanted to try something a bit different, particularly something I could bring my love of comedy into. So, when a science stand-up comedy event was pointed out to me (thanks to Tara Bryer of Climate Outreach!) I thought I’d give it a go. 

The event in question is ‘Science Showoff’, an event designed to communicate science via comedy. It’s held on the last Wednesday of every month in London, currently at The Harrison near Kings Cross station, and has been running for over 10 years. And it’s open to absolutely anyone to perform – no comedic credentials required. The only rules are that sets are 9 minutes long, must be about something STEM-related, and should (hopefully) be funny!  

I performed in the August event and decided to base my set broadly on my research field of modelling the effects of aerosols on climate. Basing the set on my research made it slightly easier as I knew the science content already and just needed to write the comedy – though one can definitely go for more adventurous topics. While to a non-scientist that might sound a bit dry, it’s actually not too difficult to come up with jokes about climate science – as anyone who’s helped write a Met Panto script will surely know.  

For example, framing it as an explanation of my hatred of something as innocuous as deodorant (which as it turns out, makes a decent low-effort physical demonstration of aerosols) seemed a good way to make content easier to understand and line up some more relatable jokes. Having a physical prop, even as simple as a deodorant can, also turned out to be an easy way to ‘wow’ the audience (they set a very low bar indeed for being impressed by my ‘live science’). There’s also a wealth of jokes from being a climate ‘modeller’ – you’ve just got to work it 😉 

On the day, while I was very nervous before the event and into the first minute or two of my set, after that it was great fun. The audience, of about 30 people, were incredibly friendly and the host, Steve, was very supportive. After all, while you’re there for comedy, there’s not much pressure as many of the acts (myself included) have never performed stand-up comedy before. The set mostly went to plan, though I did add a little improvisation in response to audience reactions when they liked a joke more than I’d expected, and when audience members were reluctant to participate – who’d have thought leading one of them into a joke at their expense would make the others so reluctant? It was also a lot of fun going from being an audience member worried about being picked on, to the one who gets to pick on people – the audience engagement was definitely the most enjoyable part.  

It was also huge fun just writing the set. I didn’t put myself under loads of pressure, just occasionally thinking of jokes while walking or on the train and making a note of them, then putting it all together the weekend before and rehearsing the evening before. Again, if you’ve ever helped write the Panto script or Sappo email, you’ll know how much fun this all can be (although I’m now regretting not getting pizza in while I wrote it).  

And as a bonus, I got to listen to the other five acts perform, sometimes riffing off my jokes too! We had everything from penguins in the Antarctic to the most embarrassing lab accidents you could imagine. The acts were by people from a range of scientific disciplines and backgrounds including PhD students, a lecturer, and a professional science communicator. 

I can’t say much more to describe the experience itself, but if you want an idea of what it’s like, you can check out some recorded previous sets (while there is some rather questionable footage of my own act, there is not a chance I’m sharing it here – I’m not that confident :P). Or of course, go attend the next Science Showoff or a similar science comedy event. 

What I would say though is if you also want to do climate science communication (or try a different format for it) and are a fan of comedy (looking at any and all Met Panto-ers especially here) then you should consider giving this a go! Yes, even if you’ve never done stand-up comedy before… I mean it can’t be more embarrassing than acting out a lecturer in Panto while they watch! 

Any questions about the experience or want to be persuaded to give it a try??? Feel free to comment or email me 🙂 

EGU 2022 

Charlie Suitters – c.c.suitters@pgr.reading.ac.uk 

Isabel Smith – i.h.smith@pgr.reading.ac.uk 

Brian Lo – brian.lo@pgr.reading.ac.uk 

What is EGU22? 

With more events resuming in person, the European Geosciences Union General Assembly 2022 (EGU22) was no exception. The General Assembly is one of the big annual conferences for the Earth sciences. For some of us, EGU22 was our first in-person conference overseas, which made it both an exciting and eye-opening experience! This year, 12,332 abstracts were presented, with 7,315 colleagues from 89 countries participating on-site in Vienna, accompanied by 7,002 virtual attendees from 116 countries. 

Venue of EGU22 – Vienna International Centre

With 791 sessions running throughout the week, working out our personal schedules was a challenge. Luckily, EGU had an online tool we used to add talks to a personal programme, without having to distribute printed programmes. Due to COVID restrictions, all presentations at EGU22 had the same format as short orals, delivered and viewed both in person and online in a hybrid format. Most talks were limited to 5 minutes, which made it a challenge to summarise our work and still deliver effective science to the audience. 

Isabel Smith giving her 5-minute talk at the High-resolution weather and climate simulation session

What is a typical day like at EGU22? 

If you planned to attend an 8.30am session, you would have had to take the U-Bahn to the conference centre, crossing your fingers that there would be no breakdowns. Most sessions lasted for one and a half hours, consisting of between 15 and 20 presentations with some time for questions and discussion. There were coffee breaks between sessions, where we could recharge with a free flow of coffee and tea.  

A variety of short courses were also on offer, such as “Writing the IPCC AR6 Report: Behind the Scenes” or “Thermodynamics and energetics of the oceans, atmosphere and climate”, co-convened by Remi Tailleux from our department. If you are likely to attend this conference in the future, sign up to the EGU newsletter, where you can find further details about the short courses and the EGU staff’s top sessions of the day.  

There was also a large exhibition hall featuring publishing companies and geoscience companies, some of which offered freebies like pens and notebooks. Outside the main exhibition halls, there were picnic benches, usually filled with conference attendees enjoying lunch or an afternoon beer after a full day of conferencing. 

What did we do other than the conference? 

Although there was an impressive showcase of presentations and networking at the 5-day long EGU, we also went sightseeing in and around Vienna. Some of us would take the opportunity of having an extended lunch break to take the U-Bahn to the centre of the city, or an afternoon off to explore a museum, or visit the Donauturm (Danube Tower) for an amazing if windy view of the city. 

We also enjoyed the dinners after long conference days, especially the night we filled ourselves with schnitzels larger than our faces and had late-night gelato after a few drinks. A few of us stayed over the weekend and visited the outskirts of the city, such as the Schönbrunn Palace and a free panoramic view of Vienna from the top of Leopoldsberg! 

Having met many familiar faces and networked with others in our field, EGU22 was a “Wunderbar” experience we would definitely recommend, especially in person! It is also a great excuse to practise your GCSE German. Just remember the phrase “Können wir die Rechnung/den Kassenzettel haben, bitte?” (“Can we have the bill/receipt, please?”) if you want to claim back your meals and other expenses from the trip! 

Dinner gathering of past and present members of the University of Reading at EGU22

Met Office Climate Data Challenge 2022

Daniel Ayers – d.ayers@pgr.reading.ac.uk  

The Met Office Climate Data Challenge 2022 was a two-day virtual hackathon-style event where participants hacked solutions to challenges set by Aon (Wikipedia: “a British-American multinational professional services firm that sells a range of financial risk-mitigation products, including insurance, pension administration, and health-insurance plans”) and the Ministry of Justice (MoJ). Participants hailed from the Met Office, UCL, and the universities of Reading, Bristol, Oxford, Exeter and Leeds. Here’s how I found the experience and what I got out of it. 

If your PhD experience is anything like mine, you feel pretty busy. In particular, there are multitudinous ways one can engage in not-directly-your-research activities, such as being part of the panto or other social groups, going to seminars, organising seminars, going to conferences, etc. Obviously these can all make a positive contribution to your experience – and seminars are often very useful – but my point is: it can sometimes feel like there are too few periods of uninterrupted time to focus deeply on actually doing your research. 

Fig. 1: There are many ways to be distracted from actually doing your research. 

So: was it worth investing two precious days into a hackathon? Definitely. The tl;dr is: I got to work with interesting people, I got an experience of working on a commercial style project (very short deadline for the entire process from raw data to delivered product), and I got an insight into the reinsurance industry. I’ll expand on these points in a bit. 

Before the main event, the four available challenges were sent out a few weeks in advance. There was a 2hr pre-event meeting the week beforehand. In this pre-meeting, the challenges were formally introduced by representatives from Aon and MoJ, and all the participants split into groups to a) discuss ideas for challenge solutions and b) form teams for the main event. It really would have helped to have done a little bit of individual brainstorming and useful-material reading before this meeting.  

As it happened, I didn’t prepare any further than reading through the challenges, but this was useful. I had time to think about what I could bring to each challenge, and vaguely what might be involved in a solution to each. I concluded that the most appropriate challenge for me was an Aon challenge about determining how much climate change was likely to impact insurance companies through changes to the things they insure (as opposed to, for example, changes in the frequency or intensity of extreme weather events which might cause payouts to be required). In the pre-meeting, someone else presented an idea that lined up with what I wanted to do: model some change in earth and human systems and use this to create new exposure data sets (for exposure data set, read “list of things the insurance companies insure, and how much a full payout will cost”). This was a lofty ambition, as I will explain. Regardless, I signed up to this team and I was all set for the main two-day event. 

Here are some examples of plots that helped us to understand the exposure data set. We were told, for example, that for some countries, a token lat-lon coordinate was used for all entries in that country. This resulted in some lat-lon coords being used with comparatively high frequency, despite the entries potentially describing large or distinct areas of land.  

The next two plots show the breakdown of the entries by country, and then by construction type. Each entry is for a particular set of buildings. When modelling the likely payout following an event (e.g. a large storm) it is useful to know how the buildings are made. 

One thing I want to mention, in case the reader is involved with creating a hackathon at any point, is the importance of challenge preparation. The key thing is that participants need to be able to hit the ground running in the event itself. Two things are key to this being possible.  

First, the challenge material should ideally provide a really good description of the problem space. In our case, we spent half of the first day in a meeting with the (very helpful) people from Aon, picking their brains about how the reinsurance industry worked, what they really cared about, what would count as an answer to this question, what was in the mysterious data set we had been given and how should the data be interpreted. Yes, this was a great opportunity to learn and have a discussion with someone I would ordinarily never meet, but my team could have spent more precious hackathon hours making a solution if the challenge material had done a better job of explaining what was going on.  

Second, any resources that are provided (in our case, a big exposure data set – see above), need to be ready to use. In our case, only one person in some other team had been sent the data set, it wasn’t available before the main event started, there was no metadata, and once I managed to get hold of it I had to spend 2-3 hours working out which encoding to use and how to deal with poorly-separated lines in the .csv file. So, to all you hackathon organisers out there: test the resources you provide, and check they can be used quickly and easily.  
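For anyone facing a similarly awkward file, the snippet below is roughly the kind of defensive loading I ended up doing: try a shortlist of encodings and skip malformed rows rather than aborting. The filename, the encoding shortlist and the column name are all hypothetical.

```python
import pandas as pd

CANDIDATE_ENCODINGS = ["utf-8", "cp1252", "latin-1"]  # common suspects, in order

def load_exposure(path):
    """Try encodings in turn; skip badly-separated lines instead of failing."""
    for enc in CANDIDATE_ENCODINGS:
        try:
            return pd.read_csv(path, encoding=enc, on_bad_lines="skip")
        except UnicodeDecodeError:
            continue  # wrong encoding -- try the next candidate
    raise ValueError(f"none of {CANDIDATE_ENCODINGS} could decode {path}")

exposure = load_exposure("exposure.csv")   # hypothetical filename
print(exposure["COUNTRY"].value_counts())  # hypothetical column name
```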

By the end of the second day, we’d not really got our envisioned product working. I’d managed to get the data open at last, and done some data exploration plots, so at least we had a better idea of what we were playing with. My teammates had found some really useful data for population change, and for determining whether a location in our data set was urban or rural. They had also set up a Slack group so that we could collaborate and discuss the different aspects of the problem, and a GitHub repo so we could share our progress (we coded everything in Python, mainly using Jupyter notebooks). We’d also done a fair amount of talking with the experts from Aon, and amongst ourselves as a team, to work out what was viable. This was a key experience from the event: coming up with a minimum viable product. The lesson from this experience was: be OK with cutting a lot of big corners. This is particularly useful for me as a PhD student, where it can be tempting to think I have time to go really deep into optimising and learning about everything required. My hackathon experience showed how much can be achieved even when the time frame forces most corners to be cut. 

To give an example of cutting corners, think about how many processes in the human-earth system might have an effect over the next 30 years on what things there are to insure, where they are, and how much they cost. Population increase, urbanisation and ruralisation, displacement from areas of rising water levels or increased flooding risk, construction materials being more expensive in order to be more environmentally friendly, immigration, etc. Now, how many of these could we account for in a simplistic model that we wanted to build in two days? Answer: not many! Given we spent the first day understanding the problem and the data, we only really had one day, or 09:45 – 15:30, so 5 hours and 45 minutes, to build our solution. We attempted to account for differences in population growth by country, by shared socio-economic pathway, and by a parameterised rural-urban movement. As I said, we didn’t get the code working by the deadline, and ended up presenting our vision, rather than a demonstration of our finished solution. 
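To give an idea of what even the corner-cutting version looks like in code, here is a toy sketch of the exposure-scaling idea: multiply each entry’s insured value by a country-level population growth factor for a chosen SSP, with a crude extra multiplier for urban locations. Every column name, number and factor here is invented for illustration; this is not the code we wrote on the day.

```python
import pandas as pd

# Invented country-level population growth factors to 2050 under one SSP
GROWTH_2050 = {"DEU": 0.98, "GBR": 1.08, "USA": 1.12}
URBAN_FACTOR = 1.10  # crude stand-in for a parameterised rural-urban movement

def project_exposure(df):
    """Scale each entry's insured value by population growth, nudged up
    for urban locations -- a deliberately simplistic toy model."""
    scale = df["country"].map(GROWTH_2050).fillna(1.0)
    scale *= df["is_urban"].map({True: URBAN_FACTOR, False: 1.0})
    return df.assign(insured_value_2050=df["insured_value"] * scale)

demo = pd.DataFrame({
    "country": ["DEU", "GBR", "USA"],
    "is_urban": [True, False, True],
    "insured_value": [1.0e6, 2.5e6, 4.0e6],
})
print(project_exposure(demo))
```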

There might be an opportunity to do more work on this project. A few of the projects from previous years’ hackathons have resulted in publications, and we are meeting shortly to see whether there is the appetite to do the same with what we’ve done. It would certainly be nice to create a more polished piece of work. That said, preserving space for my own research is also important! 

As a final word on the hackathon: it was great fun, and I really enjoyed working with my team.  PhD work can be a little isolated at times, so the opportunity to work with others was enjoyable and motivating. Hopefully, next time it will be in person. I would recommend others to get involved in future Met Office Climate Data Challenges! 

Deploying an Instrument to the Reading University Atmospheric Observatory 

Caleb Miller – c.s.miller@pgr.reading.ac.uk 

In the Reading area, December and January seem to be prime fog season. Since I’m studying the effects of fog on atmospheric electricity, that means that winter is data collection season! However, in order to begin collecting data in the first year of my PhD, there was only a short amount of time to prepare an instrument and deploy it to the observatory before Christmas. 

One of the instruments that I am using to measure fog is called the Optical Cloud Sensor (OCS). It was designed by Giles Harrison and Keri Nicoll, and is described in more detail in Harrison and Nicoll (2014). The OCS has four channels of LEDs which shine light into the surrounding air. When fog is present, the fog droplets scatter light back to the instrument, where the intensity from each channel can be measured. 

Powering the instrument 

The OCS was originally designed to be flown on a weather balloon, and so was meant to be powered by battery and run for only short periods of time. In my case, however, I wanted the device to be able to continuously collect data over a period of weeks or months without interruption. Then, we would be able to catch any fog events, even if they hadn’t been forecast. That meant the device would need to be powered by the +15V power supply available at the observatory, and my first step was to create a power adapter for the OCS so that this would be possible. 

Initially, I had been considering using an Arduino microcontroller as a datalogger, so I decided to put together a power adapter on an Arduino shield (a small electronic platform) for maximum convenience. I included multiple voltage levels on my power adapter and connected them to different power inputs on the OCS. Once this was completed, the entire system could now be powered with a single power supply that was available at the observatory! 

I was able to find all of the required parts for the power supply in stock in the laboratory in the Meteorology Department, and I soldered it together in a few days. The technical staff of the university were very helpful in this process! A photograph of the power adapter connected to an Arduino is shown in Figure 1. 

Figure 1. The power adapter for the optical cloud sensor, built on an Arduino shield 

Storing data from the instrument 

Once the power supply had been created, the next step was setting up a datalogging system. On a balloon, the data would be streamed in real time down to a ground station by radio link. But with this system deployed on the ground, that was no longer necessary. 

Instead, I decided to use a CR1000X datalogger from Campbell Scientific. This system has a number of voltage inputs which can be programmed using a graphical interface over a USB connection, and it has a port for an SD card. I programmed the datalogger to sample each of the four analog channels coming from the OCS every five seconds and to store the measurements on an SD card. Collecting the measurements was then as simple as removing the SD card from the datalogger and copying the data to my laptop. This could be done without interrupting the datalogger, as it has its own internal storage, and it would continue measuring while the SD card was removed. 
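Getting the data off the SD card and into Python is then straightforward. The sketch below assumes the CR1000X’s usual TOA5-style output (four header rows: logger metadata, field names, units, and aggregation type); the filename and channel names are hypothetical.

```python
import pandas as pd

def read_toa5(path):
    """Read a Campbell Scientific TOA5 file into a time-indexed DataFrame.

    Row 0: logger metadata   Row 1: field names
    Row 2: units             Row 3: aggregation type (Smp/Avg/...)
    """
    names = pd.read_csv(path, skiprows=1, nrows=0).columns  # field-name row
    df = pd.read_csv(path, skiprows=4, names=names, na_values=["NAN"])
    df["TIMESTAMP"] = pd.to_datetime(df["TIMESTAMP"])
    return df.set_index("TIMESTAMP")

ocs = read_toa5("CR1000X_OCS_5s.dat")  # hypothetical filename
# e.g. average the four 5-second OCS channels up to 5-minute means
five_min = ocs.filter(like="Channel").resample("5min").mean()
```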

I had also considered simultaneously logging a digital form of the measurements to an Arduino in addition to the analog measurements made by the datalogger. This would give us two redundant logging systems which would decrease the chances of losing valuable information in the event of an instrument malfunction. However, due to a shortage of time and a technical issue with the instrument’s digital channels, I was unable to prepare the Arduino logger by the time we were ready to deploy the OCS, so we used only the analog datalogger. 

Figure 2. The OCS with its new power supply being tested in the laboratory 

Deploying the instrument 

Once the power supply and datalogger were completed, the instrument was ready to be deployed! It was a fairly simple process to get approval to put the instrument in the observatory; then I met with Ian Read to find a suitable location to set up the OCS. There were several posts in the observatory which were free, and I chose one which was close to the temperature and humidity sensors in the hopes that the conditions would be fairly similar in those locations. Once everything was ready, the technicians and I took the OCS and datalogger and set it up in the field site. At first, when we powered it on, nothing happened. Apparently, one of the solder joints on my power adapter had been damaged when I carried it across campus. However, I resoldered that connection with advice from the university technical staff, and it worked beautifully! 

Figure 3. The datalogger inside its enclosure in the observatory 

Figure 4. The OCS attached to its post in the observatory  

Except for a short period of maintenance in January, the OCS has been running continuously from December until May, and it has already captured quite a few fog events! With the data from the OCS, I now have an additional resource to use in analyzing fog. The levels of light backscattered from the four channels of the instrument provide interesting information, which I am combining with electrical and visibility measurements to analyze the microphysical properties of fog development. 

Hopefully, over the next year, we will be able to measure many more fog events with this instrument that will help us to better understand fog! 

Harrison, R. G., and K. A. Nicoll, 2014: Note: Active optical detection of cloud from a balloon platform. Rev. Sci. Instrum., 85, 066104, https://doi.org/10.1063/1.4882318. 

Climate Resilience Evidence Synthesis Training 

Lily Greig – l.greig@pgr.reading.ac.uk 

The Walker Academy, the capacity-strengthening arm of the Walker Institute based at the University of Reading, holds a brilliant week-long training course every year named Climate Resilience Evidence Synthesis Training (CREST). The course helps PhD students from all disciplines to understand the role of academic research within wider society. I’m a third-year PhD student studying ocean and sea ice interaction, and I wanted to do the course because I’m interested in understanding how to better communicate scientific research, and in the process of how research is used to inform policy. The other students who participated were mainly from SCENARIO or MPECDT, studying a broad range of subjects from Agriculture to Mathematics.  

The Walker Institute  

The Walker Institute is an interdisciplinary research institute supporting the development of climate resilient societies. Their research relates to the impacts of climate variability, which includes social inequality, conflict, migration and loss of biodiversity. The projects at Walker involve partnership with communities in low-income countries to increase climate resilience on the ground. 

The institute follows a system-based approach, in which project stakeholders (e.g., scientists, village duty bearers, governments and NGOs) collaborate and communicate continuously, with the aim of making the best decisions for all. Such an approach allows, for example, communities on the ground (such as a village in North East Ghana affected by flooding) to vocalise their needs or future visions, meaning scientific research performed by local or national meteorological agencies can be targeted and communicated according to those specific needs. Equally, with such a communication network, governments are able to understand how best to continually reinforce those connections between scientists and farmers, and to make the best use of available resources or budgets. This way, the key stakeholders form part of an interacting, constantly evolving complex system. 

Format and Activities 

The course started off with introductory talks on the Walker Institute’s work, with guest speakers from Malawi (Social Economic Research and Interventions Development) and Vietnam (Himalayan University Consortium). On the second day, we explored the topic of communication in depth, which included an interactive play based on the negotiation of a social policy plan in Senegal. The play involved stepping on stage and improvising lines ourselves when we spotted a problem in negotiations. An example of this was a disagreement between two climate scientists and the social policy advisor to the President – the scientists knew that rainfall would get worse in the capital, but the social scientist understood that people’s livelihoods were actually more vulnerable elsewhere. Somebody stepped in and helped both characters understand that the need for climate resilience was more widespread than each individual character had originally thought.  

Quick coffee break after deciphering the timeline of the 2020 floods in North East Ghana.

The rest of the week consisted of speedy group work on our case study of increasing climate resilience to annual flood disasters in North East Ghana, putting together a policy brief and presentation. We were each assigned a stakeholder position, from which we were to propose future plans. Our group was assigned the Ghanaian government. We collected evidence to support our proposed actions (for example, training government staff on flood action well in advance of a flood event, rather than as an emergency response) and built a case for why those actions would improve people’s livelihoods. 

Alongside this group work, we had many more valuable guest speakers. See the full list of guest speakers below. Each guest gave their own unique viewpoint of working towards climate resilience. 

List of guest speakers 

Day 1: Chi Huyen Truong: Programme Coordinator, Himalayan University Consortium, Mountain Knowledge and Action Networks 

Day 1: Stella Ngoleka: Country Director at Social Economic Research and Interventions Development – SERID and HEA Practitioner  

Day 2: Hannah Clark: Open Source Farmer Radio Development Manager, Lorna Young Foundation 

Day 2: Miriam Talwisa: National Coordinator at Climate Action Network-Uganda 

Day 3: panel speakers:  

Irene Amuron: Program Manager, Anticipatory Action at Red Cross Red Crescent Climate Centre 

Gavin Iley: International Expert, Crisis Management & DRR at World Meteorological Organization 

James Acidri: Former member of the Ugandan Parliament, Senior Associate at Evidence for Development 

Day 4: Tesse de Boer: Technical advisor in Red Cross Red Crescent Climate Centre 

Day 5: Peter Gibbs: Freelance Meteorologist & Broadcaster 

Course Highlights 

Everyone agreed that the interactive play was a highly engaging and unusual format, and one I had not yet encountered in my PhD journey! It allowed some of us to step right into the shoes of someone whose point of view we had potentially never stopped to consider before, like a government official or a media reporter… 

The 2022 CREST organisers and participants. Happy faces at the end of an enjoyable course!

Something else that really stayed with me was a talk given by the National Coordinator at Climate Action Network Uganda, Miriam Talwisa. She shared loads of creative ideas about how to empower small or low-income communities to take climate action. These included the concept of community champions, media cafes, community dialogues, and alternative policy documentation such as citizens’ manifestos or visual documentaries. This helped me to think about my own local community and how such tools could be implemented to encourage climate action at the grassroots level.  

Takeaways  

An amazing workshop, with a lovely and supportive team running it who created a great atmosphere. I took away a lot from the experience and I think the other students did too. It really helped us to think about our own research and our key stakeholders, and about how important it is to reach out to them. 

Physical climate storylines of UK drought 

Wilson Chan – wilson.chan@pgr.reading.ac.uk  

Hydrological droughts are periods of below-normal river flows. These events negatively impact public water supply and the natural environment. The UK is commonly perceived as wet and rainy, with low risk of water supply shortages. A recent report explored the “Great British Rain Paradox” by showing that this perception does not hold true given past severe droughts and vulnerability to future droughts under climate change. The latest UKCP18 projections suggest the potential for more frequent and intense droughts across the UK.  

“Top-down” and “bottom-up” approaches 

There has been a lot of research carried out on the possible impacts of climate change on river flows in the UK. In our recently published review paper, we reviewed over 100 papers published over the past three decades and found relative agreement among studies over a possible reduction in summer river flows for catchments across the UK. There is also evidence to suggest that slow-responding groundwater-dominated catchments in the southeast, particularly important for public water supplies, may experience a reduction in river flows across all seasons.  

There remains considerable uncertainty over the magnitude of change and the temporal evolution of future droughts. In our review, we find that studies following a traditional “top-down” assessment approach may not be able to fully address key research gaps. Most of the papers we reviewed followed this approach where output from global climate models (GCMs) are fed through hydrological models of varying complexities to simulate river flows (Figure 1a). This approach often aims to analyze as many components within the impact modelling chain as possible and incurs the cascade of uncertainty (Figure 1b). Outcomes depend on the many choices made along the way (e.g. climate models, emission scenarios, hydrological models etc.) which often results in wide uncertainty ranges that are not conducive to decision-making. A large part of this uncertainty is due to differences in the atmospheric circulation response to climate change across different climate models. Studies following a “top-down” approach are therefore limited when considering plausible worst-cases (low likelihood, high impact outcomes) and the information produced often cannot be easily used in practical water resources planning. 

Figure 1 (a) Studies on the impacts of climate change on UK river flows categorized into four modelling approaches with most of the reviewed studies following top-down GCM-driven and probabilistic approaches (b) The cascade of uncertainty incurred by “top-down” GCM-driven and probabilistic studies (Source: Wilby and Dessai 2010).

We also identified several approaches that have been developed to address drawbacks of “top-down” approaches. They do not seek to replace the traditional “top-down” approaches but instead aim to explore “top-down” projections from a wider “bottom-up” framework. For example, the scenario-neutral approach does not rely on GCM simulations and explores the sensitivity of hydrological systems to a much wider range of plausible futures. The storyline approach is another example of approaches designed to explicitly understand plausible worst cases and navigate the uncertainty cascade from a decision-making context. Storylines can be seen as plausible pathways conditioned on a discrete set of changes (e.g. in atmospheric circulation, management measures or event characteristics). They are informed by multiple lines of evidence (incl. process understanding, historical reconstructions and traditional GCM projections).  

Storylines 

The second paper of my PhD, published in Hydrology and Earth System Sciences, demonstrates how the storyline approach can be applied to understand UK droughts. We used an observed event, the 2010-12 drought, as the basis for developing a range of storylines. The drought is one of the top 10 most significant multi-year UK droughts. Temporary water use restrictions affected 20 million customers, and drought conditions led to agricultural and industrial losses of over GBP 400 million. The drought was characterized by two consecutive dry winters and terminated rapidly in early 2012 with record-breaking rainfall over spring 2012. Motivated by a series of “what-if” questions, we created downward counterfactual storylines of the 2010-12 drought to reimagine how the event could have turned out worse. 

In our study, we created storylines quantifying what would happen if… 

  1. Hydrological preconditions of the drought were drier 
  2. Continued dry conditions persisted from a third dry winter instead of the observed rapid drought termination  
  3. The drought was to unfold in a warmer climate.  

We showed that the 2010-12 drought was highly influenced by catchment preconditions. Storylines of drier preconditions showed that the catchment state prior to drought inception aggravated drought conditions for some of the most affected catchments. Progressively drier preconditions could have led to shorter but more intense droughts in fast-responding catchments in Scotland, and a lagged, lengthened drought in slow-responding catchments in lowland England.  

The observed 2010-12 drought was characterized by two consecutive dry winters. Weather forecasters and water companies at the time widely anticipated that dry conditions would continue through 2012. The prospect of three consecutive dry winters is a well-known concern in the water resources industry and can lead to a significant reduction in reservoir storage. This is especially important for slow-responding catchments, as groundwater reserves are normally recharged during winter. Storylines of the 2010-12 drought given an additional dry year with dry winter conditions, either before or after the observed drought, showed the vulnerability of catchments to a “three dry winters” situation. Figure 2 shows that drought conditions could still have intensified, with even lower river flows for catchments that were already the most affected.  

Figure 2 Standardized streamflow index (SSI) over the 2010-12 drought (black) and storylines of an additional dry winter before (red) and after (blue) the observed drought for four example river catchments in SE England. Lower values of the SSI indicate below average river flows over the accumulation period (6-months; SSI-6 is shown here), and vice versa.
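For readers unfamiliar with the SSI: it is constructed like the better-known SPI, but for river flow. Flows are accumulated over n months (here 6), a distribution is fitted to each calendar month’s accumulated values, and the fitted probabilities are mapped to a standard normal, so that an SSI of, say, -2 is comparably extreme in any month or catchment. The sketch below uses a gamma fit purely for illustration; the choice of distribution matters in practice, and the published analysis should be consulted for the actual method.

```python
import numpy as np
import pandas as pd
from scipy import stats

def ssi(monthly_flow, scale=6):
    """Standardised Streamflow Index (SSI-6 by default, as in Fig. 2).

    monthly_flow : datetime-indexed pandas Series of monthly mean flows.
    """
    acc = monthly_flow.rolling(scale).mean()        # 6-month accumulations
    out = pd.Series(index=acc.index, dtype=float)
    for month in range(1, 13):                      # fit each calendar month
        vals = acc[acc.index.month == month].dropna()
        a, loc, sc = stats.gamma.fit(vals, floc=0)  # illustrative distribution
        prob = stats.gamma.cdf(vals, a, loc=loc, scale=sc)
        out.loc[vals.index] = stats.norm.ppf(prob)  # probability -> z-score
    return out

# Synthetic demo: 40 years of lognormal monthly flows
idx = pd.date_range("1980-01-01", periods=480, freq="MS")
flow = pd.Series(np.exp(np.random.default_rng(1).normal(2.0, 0.5, 480)), index=idx)
print(ssi(flow).tail())
```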

Applying the UKCP18 regional climate projections to the observed 2010-12 drought sequence, drought conditions are projected to worsen with temperature rise. Notably, the magnitude of change is lower for catchments in western Scotland due to the compensating effects of generally wetter winters, although summer months are projected to become drier with temperature rise. Benchmark severe droughts such as the 1975-76 and 1988-92 droughts are regularly used to test the feasibility of water management plans. Given a third dry winter or a >2°C temperature rise, the different counterfactual storylines of the 2010-12 drought could have led to worse conditions than both of the selected benchmark droughts (see Figure 3 for slow-responding catchments in southeast England relative to the 1988-92 drought).  

Figure 3 Comparison of the various storylines with the 1988-92 drought which was particularly severe for slow-responding catchments in SE England (especially in East Anglia). The plot shows change in mean deficit and maximum intensity for each storyline relative to the 1988-92 drought. SSI accumulated over longer periods (e.g. SSI-24) is more indicative of prolonged drought conditions in slow-responding catchments compared to SSI-6.

Event storylines created from plausible alterations made to past observed droughts can help water resources planners stress test hydrological systems against unrealised droughts. “Bottom-up” approaches exploring specific conditions relevant to water resources planning (e.g. three dry winters) can complement traditional “top-down” projections to better understand worst-cases and consider how future extreme droughts can unfold.

Chan, W.C.H., Shepherd, T.G., Facer-Childs, K., Darch, G., Arnell, N.W., 2022a. Tracking the methodological evolution of climate change projections for UK river flows. Progress in Physical Geography: Earth and Environment. https://doi.org/10.1177/03091333221079201  

Chan, W.C.H., Shepherd, T.G., Facer-Childs, K., Darch, G., Arnell, N.W., 2022b. Storylines of UK drought based on the 2010–2012 event. Hydrology and Earth System Sciences 26, 1755–1777. https://doi.org/10.5194/hess-26-1755-2022