With more events resuming in person, the European Geosciences Union General Assembly 2022 (EGU22) was no exception. The EGU General Assembly is one of the largest annual conferences for the Earth sciences. For some of us, EGU22 was our first in-person conference overseas, which made it both an exciting and eye-opening experience! This year, 12,332 abstracts were presented, with 7,315 colleagues from 89 countries participating on-site in Vienna, accompanied by 7,002 virtual attendees from 116 countries.
With 791 sessions running throughout the week, working out our personal schedules was a challenge. Luckily, EGU provided an online tool for adding talks to a personal programme, removing the need for printed programmes. Due to COVID restrictions, all presentations at EGU22 followed the same short-oral format, delivered and viewed both in person and online in a hybrid set-up. Most talks were limited to 5 minutes, which made it challenging to summarise our work and still deliver effective science to the audience.
What is a typical day like at EGU22?
If you planned to attend an 8.30am session, you would have had to take the U-Bahn to the conference centre, crossing your fingers there would be no breakdowns. Most sessions lasted for an hour and a half, consisting of between 15 and 20 presentations with some time for questions and discussion. There were coffee breaks between sessions, where we could recharge with a free flow of coffee and tea.
A variety of short courses were also on offer, such as “Writing the IPCC AR6 Report: Behind the Scenes” or “Thermodynamics and energetics of the oceans, atmosphere and climate”, co-convened by Remi Tailleux from our department. If you are likely to attend this conference in the future, sign up to the EGU newsletter, where you can find further details about the short courses and the EGU staff’s top sessions of the day.
There was also a large exhibition hall featuring publishing companies and geoscience companies, some of which offered freebies like pens and notebooks. Outside the main exhibition halls, there were picnic benches, usually filled with conference attendees enjoying lunch or an afternoon beer after a full day of conferencing.
What did we do other than the conference?
Although the five-day EGU offered an impressive showcase of presentations and networking, we also went sightseeing in and around Vienna. Some of us took an extended lunch break to ride the U-Bahn into the city centre, took an afternoon off to explore a museum, or visited the Donauturm (Danube Tower) for an amazing, if windy, view of the city.
We also enjoyed the dinners after long conference days, especially the night we filled ourselves with schnitzel larger than our faces and had late-night gelato after a few drinks. A few of us stayed over the weekend and visited the outskirts of the city, including the Schönbrunn Palace and a free panoramic view of Vienna from the top of Leopoldsberg!
Having met many familiar faces and networked with others in our field, EGU22 was a “Wunderbar” experience we would definitely recommend, especially in person! It is also a great excuse to practise your GCSE German. Just remember the phrase “Können wir die Rechnung/den Kassenzettel haben, bitte?” if you want to claim back your meals and other expenses from the trip!
The Met Office Climate Data Challenge 2022 was a two-day virtual hackathon-style event where participants hacked solutions to challenges set by Aon (Wikipedia: “a British-American multinational professional services firm that sells a range of financial risk-mitigation products, including insurance, pension administration, and health-insurance plans”) and the Ministry of Justice (MoJ). Participants hailed from the Met Office, UCL and the universities of Reading, Bristol, Oxford, Exeter and Leeds. Here’s how I found the experience and what I got out of it.
If your PhD experience is anything like mine, you feel pretty busy. In particular, there are multitudinous ways one can engage in not-directly-your-research activities, such as being part of the panto or other social groups, going to seminars, organising seminars, going to conferences, etc. Obviously these can all make a positive contribution to your experience – and seminars are often very useful – but my point is: it can sometimes feel like there are too few periods of uninterrupted time to focus deeply on actually doing your research.
So: was it worth investing two precious days into a hackathon? Definitely. The tl;dr is: I got to work with interesting people, I got experience of working on a commercial-style project (a very short deadline for the entire process from raw data to delivered product), and I got an insight into the reinsurance industry. I’ll expand on these points in a bit.
Before the main event, the four available challenges were sent out a few weeks in advance. There was a 2hr pre-event meeting the week beforehand. In this pre-meeting, the challenges were formally introduced by representatives from Aon and MoJ, and all the participants split into groups to a) discuss ideas for challenge solutions and b) form teams for the main event. It really would have helped to have done a little bit of individual brainstorming and useful-material reading before this meeting.
As it happened, I didn’t prepare any further than reading through the challenges, but this was useful. I had time to think about what I thought I could bring to each challenge, and vaguely what might be involved in solutions to each challenge. I concluded that the most appropriate challenge for me was an Aon challenge about determining how much climate change was likely to impact insurance companies through changes to the things insurance companies insure (as opposed to, for example, the frequency or intensity of extreme weather events which might cause payouts to be required). In the pre-meeting, someone else presented an idea that lined up with what I wanted to do: model some change in earth and human systems and use this to create new exposure data sets (for exposure data set, read “list of things the insurance companies insure for, and how much a full payout will cost”). This was a lofty ambition, as I will explain. Regardless, I signed up to this team and I was all set for the main two-day event.
Here are some examples of plots that helped us to understand the exposure data set. We were told, for example, that for some countries, a token lat-lon coordinate was used for all entries in that country. This resulted in some lat-lon coords being used with comparatively high frequency, despite the entries potentially describing large or distinct areas of land.
The next two plots show the breakdown of the entries by country, and then by construction type. Each entry is for a particular set of buildings. When modelling the likely payout following an event (e.g. a large storm) it is useful to know how the buildings are made.
One thing I want to mention, in case the reader is involved with creating a hackathon at any point, is the importance of challenge preparation. The key thing is that participants need to be able to hit the ground running in the event itself. Two things are key to this being possible.
First, the challenge material should ideally provide a really good description of the problem space. In our case, we spent half of the first day in a meeting with the (very helpful) people from Aon, picking their brains about how the reinsurance industry worked, what they really cared about, what would count as an answer to the question, what was in the mysterious data set we had been given, and how the data should be interpreted. Yes, this was a great opportunity to learn and have a discussion with people I would ordinarily never meet, but my team could have spent more precious hackathon hours building a solution if the challenge material had done a better job of explaining what was going on.
Second, any resources that are provided (in our case, a big exposure data set – see above), need to be ready to use. In our case, only one person in some other team had been sent the data set, it wasn’t available before the main event started, there was no metadata, and once I managed to get hold of it I had to spend 2-3 hours working out which encoding to use and how to deal with poorly-separated lines in the .csv file. So, to all you hackathon organisers out there: test the resources you provide, and check they can be used quickly and easily.
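For anyone facing a similarly messy file, here is a minimal sketch of the kind of loading loop I ended up with. The file contents, column names and encoding list below are invented stand-ins, not the actual Aon data:

```python
import io
import pandas as pd

# Invented stand-in bytes for the real exposure file: cp1252-encoded,
# with one malformed row (wrong field count), mimicking what we received.
raw = ("lat,lon,value,city\n"
       "48.2,16.4,100,Wien\n"
       "bad,row,with,too,many,fields\n"
       "47.1,9.5,250,Zürich\n").encode("cp1252")

df = None
for enc in ("utf-8", "cp1252", "latin-1"):
    try:
        df = pd.read_csv(io.BytesIO(raw), encoding=enc,
                         on_bad_lines="skip")  # drop rows with the wrong field count
        print(f"loaded {len(df)} rows with encoding={enc}")
        break
    except UnicodeDecodeError:
        continue
```

Note that latin-1 goes last in the candidate list: it maps every byte to some character, so it never raises a decoding error and would otherwise mask a better match.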
By the end of the second day, we’d not really got our envisioned product working. I’d managed to get the data open at last and made some data exploration plots, so at least we had a better idea of what we were playing with. My teammates had found some really useful data for population change, and for determining whether a location in our data set was urban or rural. They had also set up a Slack group so that we could collaborate and discuss the different aspects of the problem, and a GitHub repo so we could share our progress (we coded everything in Python, mainly using Jupyter notebooks). We’d also done a fair amount of talking with the experts from Aon, and amongst ourselves as a team, to work out what was viable. This was a key experience from the event: coming up with a minimum viable product. The lesson was: be OK with cutting a lot of big corners. This is particularly useful for me as a PhD student, where it can be tempting to think I have time to go really deep into optimising and learning about everything required. My hackathon experience showed how much can be achieved even when the time frame forces most corners to be cut.
To give an example of cutting corners, think about how many processes in the human-earth system might have an effect over the next 30 years on what things there are to insure, where they are, and how much they cost. Population increase, urbanisation and ruralisation, displacement from areas of rising water levels or increased flooding risk, construction materials being more expensive in order to be more environmentally friendly, immigration, etc. Now, how many of these could we account for in a simplistic model that we wanted to build in two days? Answer: not many! Given we spent the first day understanding the problem and the data, we only really had one day, or 09:45 – 15:30, so 5 hours and 45 minutes, to build our solution. We attempted to account for differences in population growth by country, by shared socio-economic pathway, and by a parameterised rural-urban movement. As I said, we didn’t get the code working by the deadline, and ended up presenting our vision, rather than a demonstration of our finished solution.
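To make the corner-cutting concrete, here is a toy sketch of the simplest version of what we attempted: scale each exposure entry by a country-level population growth factor and ignore everything else. All numbers and country codes below are invented for illustration, not the actual data or SSP projections:

```python
# Hypothetical population growth factors to 2050 under one chosen
# shared socio-economic pathway (made-up numbers, per ISO country code).
growth_2050 = {
    "GBR": 1.08,
    "NGA": 1.85,
    "JPN": 0.84,
}

# Toy exposure entries: (country, insured value in USD).
exposure = [
    ("GBR", 1_000_000),
    ("NGA", 500_000),
    ("JPN", 2_000_000),
]

# Crude future exposure: assume insured value scales directly with population.
future_exposure = [(c, v * growth_2050[c]) for c, v in exposure]
for country, value in future_exposure:
    print(f"{country}: {value:,.0f}")
```

Our real attempt layered a parameterised rural-urban movement on top of this kind of scaling, but the one-line version above captures the spirit: a deliberately simplistic model, chosen because it could plausibly be built in the time available.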
There might be an opportunity to do more work on this project. A few of the projects from previous years’ hackathons have resulted in publications, and we are meeting shortly to see whether there is the appetite to do the same with what we’ve done. It would certainly be nice to create a more polished piece of work. That said, preserving space for my own research is also important!
As a final word on the hackathon: it was great fun, and I really enjoyed working with my team. PhD work can be a little isolated at times, so the opportunity to work with others was enjoyable and motivating. Hopefully, next time it will be in person. I would recommend others to get involved in future Met Office Climate Data Challenges!
In the Reading area, December and January seem to be prime fog season. Since I’m studying the effects of fog on atmospheric electricity, that means that winter is data collection season! However, in order to begin collecting data in the first year of my PhD, there was only a short amount of time to prepare an instrument and deploy it to the observatory before Christmas.
One of the instruments that I am using to measure fog is called the Optical Cloud Sensor (OCS). It was designed by Giles Harrison and Keri Nicoll, and it is described in more detail in this paper: (Harrison and Nicoll 2014). The OCS has four channels of LEDs which shine light into the surrounding air. When fog is present, the fog droplets scatter light back to the instrument, where the intensity from each channel can be measured.
Powering the instrument
The OCS was originally designed to be flown on a weather balloon, so it was built to run on battery power for only short periods of time. In my case, however, I wanted the device to collect data continuously over a period of weeks or months without interruption, so that we could catch any fog events, even if they hadn’t been forecast. That meant the device would need to be powered by the +15V power supply available at the observatory, and my first step was to create a power adapter for the OCS to make this possible.
Initially, I had been considering using an Arduino microcontroller as a datalogger, so I decided to put together a power adapter on an Arduino shield (a small electronic platform) for maximum convenience. I included multiple voltage levels on my power adapter and connected them to different power inputs on the OCS. Once this was completed, the entire system could now be powered with a single power supply that was available at the observatory!
I was able to find all of the required parts for the power supply in stock in the laboratory in the Meteorology Department, and I soldered it together in a few days. The technical staff of the university were very helpful in this process! A photograph of the power adapter connected to an Arduino is shown in Figure 1.
Figure 1. The power adapter for the optical cloud sensor, built on an Arduino shield
Storing data from the instrument
Once the power supply had been created, the next step was setting up a datalogging system. On a balloon, the data would be streamed in real-time down to a ground station by radio link. But when this system was deployed to the ground, that would no longer be necessary.
Instead, I decided to use a CR1000X datalogger from Campbell Scientific. This system has a number of voltage inputs which can be programmed using a graphical interface over a USB connection, and it has a port for an SD card. I programmed the datalogger to sample each of the four analog channels coming from the OCS every five seconds and to store the measurements on an SD card. Collecting the measurements was then as simple as removing the SD card from the datalogger and copying the data to my laptop. This could be done without interrupting the datalogger, as it has its own internal storage, and it would continue measuring while the SD card was removed.
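As a rough sketch of what happens once the data are on my laptop, here is how a logger file of this kind could be read back and averaged with pandas. The column names and values below are invented stand-ins for the real CR1000X output format:

```python
import io
import pandas as pd

# Invented stand-in for one SD-card file: timestamped samples of the
# four OCS analog channels, logged every five seconds.
raw = io.StringIO(
    "timestamp,ch1,ch2,ch3,ch4\n"
    "2021-12-01 00:00:00,0.12,0.10,0.08,0.05\n"
    "2021-12-01 00:00:05,0.14,0.11,0.09,0.06\n"
    "2021-12-01 00:00:10,0.13,0.12,0.08,0.05\n"
)
df = pd.read_csv(raw, parse_dates=["timestamp"], index_col="timestamp")

# Averaging to one-minute bins smooths out sampling noise before analysis.
minute_means = df.resample("1min").mean()
print(minute_means.round(3))
```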
I had also considered simultaneously logging a digital form of the measurements to an Arduino in addition to the analog measurements made by the datalogger. This would give us two redundant logging systems which would decrease the chances of losing valuable information in the event of an instrument malfunction. However, due to a shortage of time and a technical issue with the instrument’s digital channels, I was unable to prepare the Arduino logger by the time we were ready to deploy the OCS, so we used only the analog datalogger.
Figure 2. The OCS with its new power supply being tested in the laboratory
Deploying the instrument
Once the power supply and datalogger were completed, the instrument was ready to be deployed! It was a fairly simple process to get approval to put the instrument in the observatory; then I met with Ian Read to find a suitable location to set up the OCS. There were several posts in the observatory which were free, and I chose one which was close to the temperature and humidity sensors in the hopes that the conditions would be fairly similar in those locations. Once everything was ready, the technicians and I took the OCS and datalogger and set it up in the field site. At first, when we powered it on, nothing happened. Apparently, one of the solder joints on my power adapter had been damaged when I carried it across campus. However, I resoldered that connection with advice from the university technical staff, and it worked beautifully!
Figure 3. The datalogger inside its enclosure in the observatory
Figure 4. The OCS attached to its post in the observatory
Except for a short period of maintenance in January, the OCS has been running continuously from December until May, and it has already captured quite a few fog events! With the data from the OCS, I now have an additional resource to use in analyzing fog. The levels of light backscattered from the four channels of the instrument provide interesting information, which I am combining with electrical and visibility measurements to analyze the microphysical properties of fog development.
Hopefully, over the next year, we will be able to measure many more fog events with this instrument that will help us to better understand fog!
The Walker Academy, the capacity-strengthening arm of the Walker Institute based at the University of Reading, holds a brilliant week-long training course every year called Climate Resilience Evidence Synthesis Training (CREST). The course helps PhD students from all disciplines to understand the role of academic research within wider society. I’m a third-year PhD student studying ocean and sea ice interaction, and I wanted to do the course because I’m interested in understanding how to better communicate scientific research, and in the process of how research is used to inform policy. The other students who participated were mainly from SCENARIO or MPECDT, studying a broad range of subjects from Agriculture to Mathematics.
The Walker Institute
The Walker Institute is an interdisciplinary research institute supporting the development of climate resilient societies. Their research relates to the impacts of climate variability, which includes social inequality, conflict, migration and loss of biodiversity. The projects at Walker involve partnership with communities in low-income countries to increase climate resilience on the ground.
The institute follows a system-based approach, in which project stakeholders (e.g., scientists, village duty bearers, governments and NGOs) collaborate and communicate continuously, with the aim of making the best decisions for all. Such an approach allows, for example, communities on the ground (such as a village in North East Ghana affected by flooding) to vocalise their needs or future visions, meaning scientific research performed by local or national Meteorological agencies can be targeted and communicated according to those specific needs. Equally, with such a communication network, governments are able to understand how best to continually enforce those connections between scientists and farmers, and to make the best use of available resources or budgets. This way, the key stakeholders form part of an interacting, constantly evolving complex system.
Format and Activities
The course started off with introductory talks on the Walker Institute’s work, with guest speakers from Malawi (Social Economic Research and Interventions Development) and Vietnam (Himalayan University Consortium). On the second day, we explored the topic of communication in depth, which included an interactive play based on the negotiation of a social policy plan in Senegal. The play involved stepping on stage and improvising lines ourselves when we spotted a problem in negotiations. One example was a disagreement between two climate scientists and the social policy advisor to the President: the scientists knew that rainfall would get worse in the capital, but the social scientist understood that people’s livelihoods were actually more vulnerable elsewhere. Somebody stepped in and helped both characters understand that the need for climate resilience was more widespread than each individual character had originally thought.
The rest of the week consisted of speedy group work on our case study of increasing climate resilience to annual flood disasters in North East Ghana, putting together a policy brief and presentation. We were each assigned a stakeholder position, from which we were to propose future plans. Our group was assigned the Ghanaian government. We collected evidence to support our proposed actions (for example, training government staff on flood action well in advance of a flood event, rather than as an emergency response) and built a case for why those actions would improve people’s livelihoods.
Alongside this group work, we heard from many more valuable guest speakers (see the full list below). Each guest gave their own unique viewpoint of working towards climate resilience.
List of guest speakers
Day 1: Chi Huyen Truong: Programme Coordinator Himalayan University Consortium, Mountain Knowledge and Action Networks
Day 1: Stella Ngoleka: Country Director at Social Economic Research and Interventions Development – SERID and HEA Practitioner
Day 2: Hannah Clark: Open Source Farmer Radio Development Manager, Lorna Young Foundation
Day 2: Miriam Talwisa: National Coordinator at Climate Action Network-Uganda
Day 3: panel speakers:
Irene Amuron: Program Manager, Anticipatory Action at Red Cross Red Crescent Climate Centre
Gavin Iley: International Expert, Crisis Management & DRR at World Meteorological Organization
James Acidri: Former member of the Ugandan Parliament, Senior Associate at Evidence for Development
Day 4: Tesse de Boer: Technical advisor in Red Cross Red Crescent Climate Centre
Day 5: Peter Gibbs: Freelance Meteorologist & Broadcaster
Everyone agreed that the interactive play was a highly engaging and unusual format, and one I had not yet encountered in my PhD journey! It allowed some of us to step right into the shoes of someone whose point of view we had potentially never stopped to consider before, like a government official or a media reporter.
Something else that really stayed with me was a talk given by the National Coordinator at Climate Action Network Uganda, Miriam Talwisa. She shared loads of creative ideas about how to empower climate action in small or low-income communities. These included the concept of community champions, media cafes, community dialogues, and alternative policy documentation such as citizens manifestos or visual documentaries. This helped me to think about my own local community and how such tools could be implemented to enforce climate action at the grassroots level.
It was an amazing workshop, with a lovely and supportive team running it who built a real atmosphere. I took away a lot from the experience, and I think the other students did too. It really helped us to think about our own research, our key stakeholders, and why reaching out to them is so important.
Hydrological droughts are periods of below normal river flows. These events negatively impact public water supply and the natural environment. The UK is commonly perceived as wet and rainy with low risk of water supply shortages. A recent report explored the “Great British Rain Paradox” by showing that this perception does not hold true given past severe droughts and vulnerability to future droughts under climate change. The latest UKCP18 projections suggest the potential for more frequent and intense droughts across the UK.
“Top-down” and “bottom-up” approaches
There has been a lot of research carried out on the possible impacts of climate change on river flows in the UK. In our recently published review paper, we reviewed over 100 papers published over the past three decades and found that there is relative certainty among studies over a possible reduction in summer river flows for catchments across the UK. There is also evidence to suggest that slow-responding groundwater-dominated catchments in the southeast, particularly important for public water supplies, may experience a reduction in river flows across all seasons.
There remains considerable uncertainty over the magnitude of change and the temporal evolution of future droughts. In our review, we find that studies following a traditional “top-down” assessment approach may not be able to fully address key research gaps. Most of the papers we reviewed followed this approach, in which output from global climate models (GCMs) is fed through hydrological models of varying complexity to simulate river flows (Figure 1a). This approach often aims to analyze as many components within the impact modelling chain as possible and incurs the cascade of uncertainty (Figure 1b). Outcomes depend on the many choices made along the way (e.g. climate models, emission scenarios, hydrological models), which often results in wide uncertainty ranges that are not conducive to decision-making. A large part of this uncertainty is due to differences in the atmospheric circulation response to climate change across climate models. Studies following a “top-down” approach are therefore limited when considering plausible worst cases (low-likelihood, high-impact outcomes), and the information produced often cannot be easily used in practical water resources planning.
We also identified several approaches that have been developed to address drawbacks of “top-down” approaches. They do not seek to replace the traditional “top-down” approaches but instead aim to explore “top-down” projections from a wider “bottom-up” framework. For example, the scenario-neutral approach does not rely on GCM simulations and explores the sensitivity of hydrological systems to a much wider range of plausible futures. The storyline approach is another example of approaches designed to explicitly understand plausible worst cases and navigate the uncertainty cascade from a decision-making context. Storylines can be seen as plausible pathways conditioned on a discrete set of changes (e.g. in atmospheric circulation, management measures or event characteristics). They are informed by multiple lines of evidence (incl. process understanding, historical reconstructions and traditional GCM projections).
The second paper of my PhD, published in Hydrology and Earth System Sciences, demonstrates how the storyline approach can be applied to understand UK droughts. We used an observed event, the 2010-12 drought, as the basis for developing a range of storylines. The drought is one of the ten most significant multi-year UK droughts. Temporary water use restrictions affected 20 million customers, and drought conditions led to agricultural and industrial losses of over £400 million. The drought was characterized by two consecutive dry winters and terminated rapidly in early 2012 with record-breaking rainfall over spring 2012. Motivated by a series of “what-if” questions, we created downward counterfactual storylines of the 2010-12 drought to reimagine how the event could have turned out worse.
In our study, we created storylines quantifying what would happen if…
Hydrological preconditions of the drought were drier
Continued dry conditions persisted from a third dry winter instead of the observed rapid drought termination
The drought was to unfold in a warmer climate.
We showed that the 2010-12 drought was highly influenced by catchment preconditions. Storylines of drier preconditions showed that catchment preconditions prior to drought inception aggravated drought conditions for some of the most affected catchments. Progressively drier preconditions could have led to short but more intense conditions for fast responding catchments in Scotland and a lag and lengthening of drought conditions in slow responding catchments in lowland England.
The observed 2010-12 drought was characterized by two consecutive dry winters. Weather forecasts and water companies at the time widely anticipated dry conditions to continue through 2012. The prospect of three consecutive dry winters is a well-known concern in the water resources industry and can lead to significant reduction in reservoir storage. This is especially important for slow-responding catchments as groundwater reserves are normally recharged during winter. Storylines of the 2010-12 drought given an additional dry year with dry winter conditions either before or after the observed drought showed the vulnerability of catchments to a “three dry winters” situation. Figure 2 shows that drought conditions could still have intensified with even lower river flows for catchments that were already the most affected.
Applying the UKCP18 regional climate projections to the observed 2010-12 drought sequence, drought conditions are projected to worsen with temperature rise. Notably, the magnitude of change is lower for catchments in western Scotland due to the compensating effects of wetter winters in general although summer months are projected to become drier with temperature rise. Benchmark severe droughts such as the 1975-76 and the 1989-92 droughts are regularly used to test the feasibility of water management plans. Given a third dry winter or a >2°C temperature rise, the different counterfactual storylines of the 2010-12 drought could have led to worse conditions than both the selected benchmark droughts (Figure 3 for slow-responding catchments in southeast England relative to the 1989-92 drought).
Event storylines created from plausible alterations made to past observed droughts can help water resources planners stress test hydrological systems against unrealised droughts. “Bottom-up” approaches exploring specific conditions relevant to water resources planning (e.g. three dry winters) can complement traditional “top-down” projections to better understand worst-cases and consider how future extreme droughts can unfold.
Chan, W.C.H., Shepherd, T.G., Facer-Childs, K., Darch, G., Arnell, N.W., 2022a. Tracking the methodological evolution of climate change projections for UK river flows. Progress in Physical Geography: Earth and Environment 030913332210792. https://doi.org/10.1177/03091333221079201
Chan, W.C.H., Shepherd, T.G., Facer-Childs, K., Darch, G., Arnell, N.W., 2022b. Storylines of UK drought based on the 2010–2012 event. Hydrology and Earth System Sciences 26, 1755–1777. https://doi.org/10.5194/hess-26-1755-2022
Already an endangered species, Asian elephants (Elephas maximus) continue to be increasingly threatened by habitat degradation, poaching for ivory and conflicts with people (Sukumar 2003; Menon et al., 2017). India harbours 60% of the current Asian elephant population, but only 23% of its elephant habitats lie within protected zones, while the rest are perpetually disturbed by escalating anthropogenic pressures (such as expansion of human settlements and agriculture, livestock grazing and fuelwood gathering) and economic activities (mining, construction of road and railway networks, etc.). Habitat degradation contributes to increasing elephant encounters with people, triggering human-elephant conflict (HEC). HEC in India is escalating in both severity and frequency. In the four-year period between 2015 and 2018 alone, it caused the deaths of around 2,400 people and 490 elephants, and from 2000 through 2010 around 0.5 million households suffered crop losses from elephant raiding each year (MOEF 2012; MoEF & CC, 2018). Elephants can adapt to a mosaic of natural and modified habitats, and their habitat selection is often determined by landscape composition as well as the availability of space and resources (such as vegetation and water). Understanding elephants’ space-use and distribution is therefore crucial for managing human-wildlife coexistence. We conducted our study on the space-use of elephants in the Keonjhar forest division in eastern India, where several hundred elephants have been killed as a result of electrocution, road and rail accidents, poaching and HEC.
Figure. 1: Pattern of estimated elephant occupancy, which was evaluated using the top model for occupancy probability. Keonjhar forest division has seven forest ranges (Barbil, Bhuiyan-Juang Pihra (BJP), Champua, Ghatgaon, Keonjhar, Patna and Telkoi). Five elephant habitat cores (light blue color polygon) were identified and named as CFR, KFR, BFR, GFR and TFR
We used a popular species-distribution technique called occupancy modelling, which analyses detection histories of elephant presence or absence at surveyed sites (MacKenzie et al., 2017) to estimate the probability of elephant presence and the factors driving it. For the occupancy models we used elephant GPS locations at different sites along with anthropogenic and environmental variables, including climate variables such as precipitation derived from monthly rain-gauge data and mean annual temperature from MODIS-MOD11A1. Sentinel-2A satellite images were very helpful for extracting land-cover variables such as forest, cropland and settlements.
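To give a flavour of the idea behind occupancy modelling (this is not our actual analysis, just a minimal illustrative sketch on simulated detection histories, with made-up values for the true occupancy and detection probabilities), a single-season occupancy model can be fitted by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize

# Simulate detection histories: 200 sites, each surveyed 5 times
rng = np.random.default_rng(0)
n_sites, n_visits = 200, 5
true_psi, true_p = 0.4, 0.6              # occupancy and detection probabilities
occupied = rng.random(n_sites) < true_psi
y = (rng.random((n_sites, n_visits)) < true_p) & occupied[:, None]

def neg_log_lik(params):
    """Single-season occupancy likelihood (MacKenzie et al.)."""
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(params)))  # logit -> probability
    d = y.sum(axis=1)                                   # detections per site
    # Sites with at least one detection are certainly occupied;
    # all-zero histories may be unoccupied OR occupied-but-missed.
    ll_detected = np.log(psi) + d * np.log(p) + (n_visits - d) * np.log(1 - p)
    ll_missed = np.log(psi * (1 - p) ** n_visits + (1 - psi))
    return -np.where(d > 0, ll_detected, ll_missed).sum()

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-res.x))
print(f"estimated occupancy {psi_hat:.2f}, detection {p_hat:.2f}")
```

In the real study, the occupancy and detection probabilities are themselves modelled as functions of the covariates (open forest, transportation networks, precipitation and so on), which is how relationships like those in Figure 2 are obtained.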
We observed elephant occupancy in 43% of the study region (about 2,710 km2) (Figure 1), and occupancy was higher in regions with over 40% open forest cover (Figure 2B). It is easy to assume that a mega-herbivore like the elephant would prefer dense forests with minimal anthropogenic disturbance. However, we were surprised to find that elephants were actually drawn towards forests in human-dominated landscapes with multiple land-use activities, rather than relatively intact forests (Sitompul et al., 2013; Huang et al., 2019). Scrubs and grasses, a primary forage of elephants, grow easily in open forests thanks to the better space and light conditions. Open forest was thus the strongest variable influencing elephant occupancy, playing an important role in providing elephants with food and shelter as well as in their thermoregulation.
Figure 2: Relationships between elephant detectability and the influential covariates.
Furthermore, train and vehicle collisions have been one of the major causes of elephant mortality over the years (Jha et al., 2014; Dasgupta & Ghosh, 2015), and consistent with this we found lower elephant occupancy in regions with denser transportation networks (Figure 2F). Even though crops are not natural forage for elephants, elephants preferred them over natural forage because of their higher accessibility, palatability and nutrition (Sukumar, 1990; Campos-Arceiz et al., 2008). Elephant detectability near croplands was therefore relatively high.
When it comes to climate variables, we found a positive influence of precipitation on elephant detection, in contrast to a study conducted in an extremely wet landscape of southern India, which found precipitation to be the least influential covariate. We believe that favourable rainfall conditions improved water availability and increased the productivity of deciduous forests with an abundance of palatable trees (Kumar et al., 2010; Jathanna et al., 2015), attracting more elephants to these regions of the study area. It is therefore reasonable to expect variations in precipitation to be reflected immediately in elephants' space-use, as rain-driven vegetation can prompt highly opportunistic elephant movement patterns.
It is very challenging to demarcate exclusive regions for people and elephants within the varied landscapes of India, where both human and elephant populations are high. However, given the presence of areas that elephants use frequently, such as the five habitat cores we identified in our study (Figure 1), we can conclude that this region still has the potential to support a significant elephant population (Tripathy et al., 2021). For efficient landscape management and planning it is therefore critical to understand the spatial factors that influence elephants' space-use in this region; this understanding will help ensure peaceful coexistence between elephants and people while also informing elephant conservation strategies.
The National Centre for Earth Observation (NCEO) is a distributed NERC centre of over 100 scientists from UK universities and research organisations (https://www.nceo.ac.uk). Last month NCEO launched its new and exciting headquarters at Space Park Leicester. After the launch, researchers from various institutions with affiliations to NCEO were invited to a forum at the new HQ: an introductory workshop in Machine Learning and Artificial Intelligence. We were both lucky enough to attend this in-person event (with the exception of a few remote speakers)!
As first year PhD students, we should probably introduce ourselves:
Laura – I am a Scenario student based in the Mathematics department; my project is ‘Assimilation of future ocean-current measurements from satellites’. This will involve applying data assimilation to ocean-current velocities in preparation for data from future satellites. My supervisor is also the training lead and co-director of Data Assimilation at NCEO. I was thrilled to be able to attend this forum and learn new techniques that can be used in earth observation.
Ieuan – I am a Scenario Associate based in the Meteorology department. My project is titled ‘Complex network approach to improve marine ecosystem modelling and data assimilation’. In my work, I hope to apply some complex-network-informed machine learning techniques to predict concentrations of the less observable nutrients in the ocean from well-observed quantities – such as phytoplankton! As a member and fundee of NCEO, I was excited to see a training event on offer that was highly relevant to my project.
Machine Learning (ML) and Artificial Intelligence (AI) are often thought of as intimidating and amorphous topics. This fog of misconceptions was quickly cleared up, however, as the workshop provided a brilliant, fascinating and well-structured introduction into how these fields can be leveraged in the context of earth observation.
Introduction to NCEO
The forum began bright and (very) early on Wednesday morning at Space Park Leicester. Our first day of ML training opened with an introduction to NCEO by the director, John Remedios, and the training lead, Amos Lawless. We each had the opportunity to introduce ourselves and our research in a quick two-minute presentation. This highlighted the variety in both background and experience in an entirely positive way! As well as benefiting directly from the training itself, we enjoyed being in a room full of enthusiastic people with knowledge and niches aplenty.
Introducing ourselves and our research
Next, we had a talk from Sebastian Hickman, a PhD student at the University of Cambridge, who introduced his work on using ML with satellite imagery to detect tall-tree mortality in the Amazonian rainforest. A second talk was given by Duncan Watson-Parris from the University of Oxford on using ML to identify ship tracks in satellite imagery. These initial talks immediately got us thinking about the different ways in which ML could be used within the realm of earth science. On the second day we also had talks from ESA’s Φ-lab on a whole host of different uses for AI in earth observation.
To begin our ML training, members of NEODAAS (NERC Earth Observation Data Acquisition and Analysis Service), David Moffat and Katie Awty-Carroll, led us through an introduction to ML and AI, and their uses and importance in a modern scientific context. The graphic below – presented by David and Katie – makes a digestible distinction between some commonly conflated terms in the subject area:
AI vs ML vs DL*
The discussion on the limitations and ‘bottlenecks’ of ML was of particular interest; it highlighted the numerous considerations to be made when developing an ML solution. For example, the subset of data used to train a model should ideally be representative of the entire system, avoiding, or at least acknowledging, the potential biases introduced by: human preferences in selecting and filtering training data; the method of data collection; the design of the ML techniques used; and how we interpret the outputs. While this may seem obvious at first, it is certainly not trivial. There are high-profile and hotly debated examples of AI being used in the real world where biases have led to significant human consequences (https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6875681).
We were prompted to consider these ethical questions and the efficacy of ML in the context of earth science: Which problems does ML help us solve and, perhaps more importantly, which problems are we willing to entrust it with?
We then began the practical sessions, which all fell within the broad umbrella of ML. This required a slight mindset shift from traditional programming as, even from a top-down perspective, the way we approached problems was completely different.
We were given Jupyter notebooks to work through three separate practicals: random forest classification, neural networks for regression, and convolutional neural networks. Each showed a different application and use-case of ML, giving us more ideas on how it could potentially be implemented in our own research. Alongside this, we were given a workflow task to think about over the two days: how could we use ML in our own projects? At the end of the second day we each presented our ideas and were given feedback. This grounded the talks with an ongoing focus on relating new knowledge back to our own varied fields, allowing the workshop to elegantly handle the variety and promote the actual use of the skills in our own work.
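To give a flavour of the first practical (this is a generic sketch on synthetic data, not the NEODAAS notebook itself), a random forest classifier takes only a few lines in scikit-learn:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a labelled earth-observation dataset
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of decision trees, each trained on a bootstrap sample of the data
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)   # fraction correct on held-out data
```

Part of the mindset shift mentioned above is visible even here: rather than hand-writing the classification rules, we specify the model family and let it learn the rules from the training split, then judge it only on data it has never seen.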
The forum was academically challenging but it was also great fun! Surrounding the concentrated days of learning, the forum offered us plenty of chances to connect with others. We were given a tour of the Space Park, an impressive space you could say was out of this world. The evening activities, bowling and shuffleboard, had a great atmosphere too!
By the end of the event, the interest and enthusiasm of the attendees had been effectively transformed into new understanding and conversation – which is unsurprising considering the increasing relevance ML is gaining in the field of earth science. Further to this, making connections over the pandemic has been difficult, so we felt extremely fortunate that we were able to meet in person.
Laura – The forum was an exciting insight into a field I had no experience in. Although my immediate work is focused on the application of data assimilation to ocean measurements, which does not directly relate to ML at the moment, data assimilation has high potential to overlap with ML. The forum has furthered my understanding of fields that surround the focal point of my research. In turn, this has helped me gain a more well-rounded knowledge base, opening doors to new directions my research could take.
Ieuan – The forum has certainly given me many new avenues to explore when approaching the intended application of ML in my work, perhaps starting simple with a neural network for multivariate regression and expanding from there. The hands-on practicals were a valuable opportunity for practice and a great chance for some informal discussion on the details of ML implementation with my peers. Moreover, the event has equipped us with the skills to effectively engage with other academics when they present ML-based work – which is something I would love to do in future events!
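A minimal version of that kind of starting point (purely illustrative: the inputs and targets below are random stand-ins, not real ocean data or the method from either of our projects) could use scikit-learn's MLPRegressor, which handles multiple outputs directly:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Random stand-ins: 4 "well-observed" inputs, 2 "poorly observed" targets
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
Y = X @ rng.normal(size=(4, 2)) + 0.1 * rng.normal(size=(400, 2))

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# A small feed-forward network regressing both targets at once
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X_train, Y_train)
r2 = model.score(X_test, Y_test)   # coefficient of determination on held-out data
```

From a skeleton like this, complexity can be added incrementally: more layers, more realistic inputs, or the complex-network-informed features the project aims for.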
We both hope there will be more NCEO workshops like this in the future, perhaps an event or meetup that focuses on the intersection of ML and data assimilation, as these topics resonate with us both. We’d like to thank the NEODAAS staff from PML for leading the training and Uzma Saeed for organising the forum. It was a fun and engaging experience that we are grateful to have taken part in and we would encourage anyone with the opportunity to learn about ML to do so!
* Graphics were provided by the NEODAAS slides used at the NCEO forum
Due to lockdowns and travel restrictions since 2020, networking opportunities in science have been transformed. We can expect to see a mix of virtual and hybrid elements persist into the future, offering both cost-saving and carbon-saving benefits.
The MeteoXchange project aims to become a new platform for young atmospheric scientists from all over the world, providing networking opportunities and platforms for collaboration. The project is an initiative of the German Federal Ministry of Education and Research and the German research foundation Deutsche Forschungsgemeinschaft. Events are conducted in English and open to young scientists anywhere.
This year marked the first ever MeteoXchange conference, which took place online in March 2022. The ECS (early career scientists) conference took place over two days, on gather.town. An optional pre-conference event gave the opportunity for new presenters to work on presentation skills and receive feedback at the end of the main conference.
Five presenter sessions were split over two days, with young scientists sharing their research in a conference hall on the virtual platform gather.town. Topics ranged from lidar sensing and reanalysis datasets to cloud microphysics and the health impacts of UV radiation. I really enjoyed talks on the attribution of ‘fire weather’ to climate change, and machine learning techniques for thunderstorm forecasting! The first evening concluded with a screening of the documentary Picture a Scientist.
During the poster session on the second day, I presented my research poster to different scientists walking by my virtual poster board. Posters were designed to mimic the large A2 printouts seen at in-person events. Two posters that really stood out were a quantification of SO2 emissions from Kilauea volcano in Hawaii, and an evaluation of air quality in Cuba’s Mariel Bay using meteorological diagnostic models combined with air dispersion modelling.
Anticipating that it might be hard to communicate on the day, I added a lot of text to my poster. However, I needn’t have worried, as the virtual platform worked flawlessly for poster Q&A – the next time I present on a similar platform I will use less text and instead focus on a more traditional layout!
By the conference end, I got the impression that everyone had really enjoyed the event! Awards were given for the winners of the best posters and talks. The ECS conference was fantastically well organised by Carola Detring (DWD) and Philipp Joppe (JGU Mainz), and a wonderful opportunity to meet researchers from around the world.
Since July 2021, MeteoXchange have held monthly meetups, predominantly featuring lecturers and professors who introduce research at their institute for early career scientists in search of opportunities!
The opportunities shared at MeteoMeets are complemented by joblists and by the MeteoMap: https://www.meteoxchange.de/meteomap. The MeteoMap lists PhD and postdoc positions across Germany, neatly displayed with different markers depending on the type of institute. This resource is currently still under construction.
One of the most exciting aspects of the MeteoXchange project is the opportunity for international collaboration with travel grants!
The travel funds offered by MeteoXchange are for two or more early career scientists in the field of atmospheric sciences. Students must propose a collaborative project that aims to spark future work and networking between their institutions. If the application is successful, the students can access €2,500 in travel funds.
Over the last two weeks of April, I will be collaborating with KIT student Fabian Mockert on “Dunkelflauten” (periods of low renewable-energy production, or “dark wind lulls”). Dunkelflauten, especially cold ones, place a high electricity load on national transmission networks, leading to high costs and potentially causing the failure of a fully renewable power system (doi.org/10.1038/nclimate3338). We are collaborating on power system modelling to better understand how this stress manifests itself. Fabian will spend two weeks visiting the University of Reading campus, meeting with students and researchers from across the department.
The 2022 travel grant deadline has already closed; however, it is hoped that MeteoXchange will receive funding to continue this project into future years, supporting young researchers in collaboration and idea-exchange.
To get involved with the MeteoMeets, and to stay up to date on MeteoXchange-related opportunities, sign up to the mailing list!