SHORT COURSES AND WORKSHOPS
REGISTRATION OPENS JUNE 8, 2026
The short courses and workshops registration system will open June 8, 2026. Please note that seating is limited and will be assigned on a first-come, first-served basis.
Phytoplankton show themselves during the spring bloom in the North Sea. This Landsat 8 image of an eddy in the North Sea was collected on May 14, 2018. From https://oceancolor.gsfc.nasa.gov/gallery/565/
WORKSHOP
Machine Learning and Aquatic Remote Sensing
WHEN: Sunday, September 13, 09:00 – 13:00
WHERE: Wintercircus Auditorium & Foyer
LED BY: Roy El Hourany (LOG, Université du Littoral d’Opale) and Mortimer Werther (Eawag)
This workshop brings together machine learning and aquatic remote sensing (open ocean, inland, and coastal waters). The workshop is split into three parts and will last approximately 3 hours (including a break).
Part A will start with a concise “flash course” introducing key machine learning concepts and algorithms commonly used in this field. Participants will work with satellite and in-situ datasets centered on the retrieval of phytoplankton pigments and community structure from space using remote-sensing reflectance and contextual datasets. Through this exercise, they will learn how to prepare and harmonize the datasets, build and validate models, and interpret the results. Part A will take about 1.5 hours.
Part B builds on the models and concepts introduced in Part A to extend the analysis to model evaluation. It examines how models generalize beyond the data seen during training, how to quantify estimation uncertainty, and how to assess and improve calibration so that estimated values correspond accurately to observed outcomes. We will explicitly cover how methodological choices (e.g., training strategy, validation design) influence reported model performance, and how to attach uncertainty estimates when applying models to satellite imagery. We will also demonstrate how to report these elements in a transparent and reproducible manner. Part B will last about 1 hour.
Part C will focus on distinguishing machine learning from artificial intelligence (AI) in this field. It will give participants a brief review of current research topics and trends and examine how AI may influence the field in the future. This last part is designed to take approximately 30 minutes.
For Parts A and B, programming knowledge is highly recommended but not required to participate. The course materials will be provided in Python (open source); only Part A will also be available in MATLAB. Each participant should bring their own device (e.g., laptop) to follow along. Jupyter notebooks containing the datasets and all code will be provided during the session (and, if possible, shared in advance as pre-workshop resources). Setting up Jupyter (https://jupyter.org/install) is required before the beginning of the workshop (contact us beforehand if you require assistance).
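To give a flavor of the Part A exercise, here is a minimal sketch of a retrieval workflow in Python with scikit-learn. This is not the actual course material: the "reflectance" data below are synthetic stand-ins for the satellite/in-situ match-ups used in the workshop, and the model choice is purely illustrative.

```python
# Illustrative sketch: estimate a pigment concentration from
# remote-sensing reflectance (Rrs), with a held-out validation set
# to check generalization (the Part B theme). Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic Rrs at five visible bands (sr^-1) and a toy pigment
# that depends linearly on two of the bands, plus noise
n = 500
rrs = rng.uniform(0.001, 0.02, size=(n, 5))
chl = 100 * rrs[:, 4] - 50 * rrs[:, 1] + rng.normal(0, 0.05, n)

# Hold out data unseen during training to evaluate generalization
X_train, X_test, y_train, y_test = train_test_split(
    rrs, chl, test_size=0.3, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"validation R^2: {r2_score(y_test, model.predict(X_test)):.2f}")
```

The same pattern (prepare, split, fit, validate) carries over to the real pigment and community-structure datasets used in the session.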
SHORT COURSE
Ship-Based Underway Optical Measurements: Best Practices for Collection and Processing
WHEN: Sunday, September 13, 09:00 – 13:00
WHERE: Wintercircus Expo #2
LED BY: Nils Haëntjens and Guillaume Bourdin (University of Maine)
This short course will:
- Review best practices for building an underway system, including suggestions of hardware.
- Introduce the open-source software Inlinino to log data from instruments.
- Introduce the open-source software InlineAnalysis to process and quality control data.
- Present the latest corrections for the ACS in flow-through mode: scattering correction and CDOM interpolation.
SHORT COURSE
Working with Aquaverse and STREAM: Atmospheric Correction, Water Quality, and Uncertainty Products for Inland and Coastal Waters
WHEN: Sunday, September 13, 09:00 – 13:00
WHERE: Wintercircus Catacombs
LED BY: Ryan E. O’Shea, Arun M. Saranathan, Akash Ashapure, and Will Wainwright
(Science Systems and Applications, Inc., and NASA Goddard Space Flight Center)
This course will cover the fundamentals of working with Aquaverse, a machine-learning-centered processing workflow for generating downstream products from remote sensing observations. Aquaverse is an aquatic inversion scheme that estimates biogeochemical parameters and inherent optical properties, as well as their associated prediction uncertainties, from remotely sensed optical data over inland and coastal waters. Pre-processed chlorophyll-a, total suspended solids, and Secchi disk depth products are available from Sentinel-2's Multispectral Instrument (MSI) and Landsat-8/9's Operational Land Imager (OLI/OLI-2) for CONUS and select regions around the globe via our web interface https://ladsweb.modaps.eosdis.nasa.gov/stream/. Pre-processed products will also be available for download, for global inland and coastal waters, from the Ocean Color Instrument (OCI) aboard the PACE mission.
Alternatively, similar water quality products can be estimated for most multi- and hyperspectral missions, including Sentinel-2's MSI, Landsat-8/9's OLI/OLI-2, Sentinel-3's Ocean and Land Colour Instrument (OLCI), the Earth Surface Mineral Dust Source Investigation (EMIT), Planet's SuperDove, the Moderate Resolution Imaging Spectroradiometer (MODIS), the Visible Infrared Imaging Radiometer Suite (VIIRS), the Medium Resolution Imaging Spectrometer (MERIS), the Hyperspectral Imager for the Coastal Ocean (HICO), PACE's OCI, and the PRecursore IperSpettrale della Missione Applicativa (PRISMA), via user-installable machine learning models. Aquaverse machine learning models for atmospheric correction of MSI and OLI imagery are also supported via these tutorials.
First, the course will introduce use cases of Aquaverse products via maps, scatterplots, and time series covering dynamic inland and coastal regions, including Lake Erie, the Chesapeake Bay, and small ecosystems across Europe. Second, the course will lead users through downloading, loading, and building time series from the available online products. Third, we will demonstrate installation of, and example scripts for, the user-installable machine learning model, termed a Mixture Density Network (MDN), intended for atmospheric correction (AC) and water quality product (WQP) retrieval from supported multi- and hyperspectral sensors. Finally, there will be time for users to interact with course instructors to (1) learn the fundamentals of MDNs, (2) install the MDN, (3) run example scripts, and (4) discuss potential projects, applications, and user needs. Users are encouraged to bring their own laptops to the workshop to install the MDN toolbox (available at https://github.com/STREAM-RS/STREAM-RS).
SHORT COURSE
SeaDAS: An In-Depth Overview with a Focus on PACE Data
WHEN: Sunday, September 13, 13:30 – 17:30
WHERE: Wintercircus Auditorium & Foyer
LED BY: Daniel Knowles and Aynur Abdurazik (NASA)
INTENDED AUDIENCE: Scientists and data users interested in understanding SeaDAS capabilities and PACE data workflows, including those new to PACE or evaluating SeaDAS.
This course focuses on how to use SeaDAS and its science processors for the visualization, analysis, and processing of satellite data. The course is fast-paced and looks deeply into many of the SeaDAS tools. Due to the time limitation and the amount of material covered, it is not intended as a follow-along workshop, and laptops are not required. Materials will be available with instructions for the case studies. This course does not cover installation of SeaDAS, installation of the science processors, system requirements, or platform-dependent configuration.
TOPICS:
- PACE OCI Data: load and understand PACE OCI data.
- Visualization Overview: addresses color palettes, layers, vectors, pins, true color imagery, preferences, and other tools.
- PACE OCI Spectral Analysis: case study which demonstrates the use of spectrum view, pins, band math, L2merge, masking, statistics, and pixel extraction tools.
- The NASA SeaDAS Science Processors: overview and case study which demonstrates the use of the L2gen, L2bin and L3Mapgen processors (both in the GUI and at the command line). Time permitting, a brief discussion of atmospheric correction in L2gen (PACE OCI–specific) may be included.
- PACE HARP2 Angular Analysis: case study which demonstrates use of angular view, animation, land mask, and soft button tools.
- System Performance Optimization: a discussion on optimizing system performance through the management of VM memory parameters, cache and tile sizes, and file compression.
SHORT COURSE
Polarization Optics in the Aquatic Environment
WHEN: Sunday, September 13, 13:30 – 17:30
WHERE: Wintercircus Expo #2
LED BY: Tristan Harmel (Magellium), Olivier Burggraaff (NPL), Robert Foster (NRL), and Amir Ibrahim (NASA)
As an electromagnetic wave, light can be characterized by its amplitude (intensity), frequency (wavelength), and state of polarization. Each of these components provides important insights into the interaction of light with matter. In the natural world, certain species of insects and crustaceans can sense polarization, while other animals use it for camouflage. Within aquatic environments, polarization is everywhere, from atmospheric skylight to in-water radiation, to reflection and refraction by the air-sea interface.
This course gives a brief overview of recent developments in the use of polarization for aquatic environment monitoring, including assessment of aerosols, atmospheric correction, reflection and transmission at the water surface, and the distinct signatures of optically active water constituents. After a short historical review of the successive discoveries that shaped our understanding of light polarization in the aquatic environment, the following topics will be addressed:
- Basic polarization theory in light scattering and radiative transfer
- Ways that polarization is generated in atmosphere-water systems (scattering, Umov effect, reflection/refraction)
- Modeling of inherent polarization properties through the scattering (Mueller) matrix of phytoplankton, suspended sediments, and artificial materials (e.g., microplastics)
- Using polarization to help disentangle elastic and inelastic (e.g., fluorescence) light contributions
- Correction of reflected light for above-water radiometry
- Simulations from the water column to the top-of-atmosphere level with a vector radiative transfer model (e.g., OSOAA)
- Visualization of the polarization parameters from in-situ measurements (e.g., polarimetric cameras)
- Download and use of PACE polarimeter data, comparison with OSOAA model and/or measurement data from cruises/airborne instruments
Skills in Python programming and the use of Jupyter notebooks will be beneficial for the practical examples.
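As a small taste of the visualization topics above, here is a minimal sketch (not course material; the function name and Stokes convention shown are illustrative) of computing the degree and angle of linear polarization from the Stokes parameters I, Q, U, as one might do with polarimetric camera or PACE polarimeter data:

```python
# Illustrative sketch: degree of linear polarization (DoLP) and
# angle of linear polarization (AoLP) from Stokes parameters.
import numpy as np

def dolp_aolp(I, Q, U):
    """Return (DoLP in [0, 1], AoLP in radians) for Stokes I, Q, U."""
    dolp = np.hypot(Q, U) / I       # sqrt(Q^2 + U^2) / I
    aolp = 0.5 * np.arctan2(U, Q)   # reference-plane angle
    return dolp, aolp

# Fully horizontally polarized light: Q = I, U = 0
d, a = dolp_aolp(I=1.0, Q=1.0, U=0.0)
print(d, a)  # DoLP = 1.0, AoLP = 0.0
```

The same expressions apply element-wise to image arrays, which is how polarimetric camera frames are typically mapped to DoLP/AoLP products.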
SHORT COURSE
Introduction to Lidar for Ocean Color Applications
WHEN: Sunday, September 13, 13:30 – 17:30
WHERE: Wintercircus Catacombs
LED BY: Cédric Jamet (LOG, Université du Littoral d’Opale), Davide Dionisi (Istituto di Scienze Marine, ISMAR-CNR), Kelsey Bisson (NASA), and Brian Collister (NASA)
Whether or not you care about lidar, lidar cares about you! With a number of planned investments across space agencies, lidar technologies are poised to become our next frontier for foundational ocean science and discovery. Passive remote sensing of ocean color fundamentally changed our view of the distribution of phytoplankton and other optically active constituents. However, these observations have limitations that can be overcome using the active remote sensing technique called lidar (light detection and ranging). This technique has led to many ocean discoveries despite not having an ocean-optimized lidar satellite in orbit, including the first global images of phytoplankton from space-borne lidar in 2013.
Since then, oceanic lidar applications have developed rapidly, from the detection of scattering layers to the retrieval of seawater inherent optical properties and biogeochemical parameters over the vertical, down to about 60 meters, from aircraft and ships of opportunity.
To increase lidar literacy in our ocean community, it is necessary to provide comprehensive background and training not only on the technique itself but also on how to process the lidar signal. That is the aim of this course. Fundamentals of lidar will be presented, followed by examples of oceanic applications. A practical exercise will teach participants how to process airborne and satellite lidar data for ocean properties. The content of the course is:
- Introduction and fundamentals of lidar (1 hr): description of the instrument, the lidar equation, the different types of lidar, and lidar algorithms
- Oceanic applications (15 mins): airborne, shipborne, and space-borne; scattering layers, estimation of chl-a and POC, zooplankton; polar regions; profiles of IOPs
- Break for participants (15 min)
- Practical exercise (2 hr 30 mins): a theoretical exercise to build intuition about the challenges of photon counting, followed by data processing of spaceborne (ICESat-2) and airborne lidar data for the estimation of profiles of IOPs and chl-a. Where do we find the data? What do they look like? What are the issues to deal with? Which algorithms make sense to use, and when?
The practical part requires a laptop and programming skills. Most of the code will be in Python.
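To give a flavor of the intuition the practical exercise aims to build, here is a minimal sketch (not course material; the values and the homogeneous-water assumption are illustrative) of a simplified elastic lidar equation and its inversion:

```python
# Illustrative sketch: for a homogeneous water column, the
# range-corrected lidar return decays as exp(-2*K*z), so the lidar
# attenuation coefficient K can be recovered from the log-slope.
import numpy as np

z = np.linspace(1.0, 30.0, 60)   # depth below surface (m)
beta = 1e-3                      # volume backscatter (m^-1 sr^-1), toy value
K = 0.12                         # lidar attenuation coefficient (m^-1), toy value

# Simplified elastic lidar equation: two-way attenuation, 1/z^2 geometry
signal = beta * np.exp(-2.0 * K * z) / z**2

# Inversion: range-correct, then fit the log-slope to estimate K
range_corrected = signal * z**2
slope = np.polyfit(z, np.log(range_corrected), 1)[0]
K_est = -0.5 * slope
print(K_est)  # recovers the input K of 0.12
```

Real profiles add noise, multiple scattering, and layered water, which is exactly what the ICESat-2 and airborne exercises then confront.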
Questions?
Contact Jenny Ramarui,
Conference Coordinator,
at [email protected]
or +1 301-251-7708