DASH is October 9-11, 2023, at JHU/APL in Laurel, MD
(co-located with the IHDEA meeting, Oct 12-13, see https://ihdea.net/events.html)
Over the course of the last few years, few technologies have become as ubiquitous as cloud technologies. The building blocks of this revolution are containers, built with tools such as Docker, and orchestration systems such as Kubernetes and Docker Swarm. Furthermore, data volumes are growing rapidly, and although it is still possible to store the data locally, doing so is becoming increasingly unwieldy. Storing data in the cloud makes sense, but running software against that data without first downloading it works best when the software itself is containerized.
There are many ways to make use of the cloud, and they all have their own challenges. Specifically, legacy software needs to be retrofitted for cloud usage, while other tools might be developed from the ground up to take advantage of cloud technologies.
This session offers a forum to discuss data analysis tools in the cloud, the challenges people have encountered, and what they have learned from solving them.
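As one concrete illustration of working with data in place rather than downloading it, the sketch below opens an object in public cloud storage and reads only the bytes it needs. It is a minimal sketch, assuming the fsspec/s3fs Python packages; the bucket and file names are hypothetical placeholders, not real holdings.

    # Minimal sketch: read part of a data product directly from object storage
    # without downloading it to local disk first.
    # Assumes `pip install fsspec s3fs`; the bucket/key below are placeholders.
    import fsspec

    # Anonymous access to a public S3 bucket (hypothetical path).
    fs = fsspec.filesystem("s3", anon=True)

    # List what is available under a (hypothetical) mission prefix.
    for key in fs.ls("example-helio-bucket/level2/"):
        print(key)

    # Open one object as a file-like handle; libraries that accept file objects
    # (astropy.io.fits, h5py, etc.) can read from it without a local copy.
    with fs.open("example-helio-bucket/level2/example_20231009.fits", "rb") as f:
        header_bytes = f.read(2880)  # e.g. just the first FITS header block
        print(len(header_bytes), "bytes read from cloud storage")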
Modern observing strategies are producing data of both higher volume and higher dimensionality than in the past. This trend will likely continue and potentially accelerate in the future. Consequently, ever more complicated data reduction pipelines are required to calibrate the data and produce quality data products that help answer scientific questions. An effective pipeline balances many goals, such as time and resource efficiency, traceability and transparency, safety and security, longevity, testability, and easy usability by other scientific teams to produce necessary non-standard data products. This session explores how we can use modern software practices and tools to build successful pipelines. We invite presentations from current and upcoming missions to discuss their designs. We also welcome contributions from past missions as an experience-based guide for what does and does not work well.
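To make the traceability and testability goals concrete, here is one possible shape for a pipeline step that writes a provenance record next to its output. It is a minimal sketch using only the Python standard library; the function, version string, and file names are hypothetical.

    # Minimal sketch of a traceable pipeline step: each output is written with a
    # sidecar record of the input checksum, software version, and timestamp.
    # All names are hypothetical; only the Python standard library is used.
    import datetime
    import hashlib
    import json
    import pathlib

    PIPELINE_VERSION = "0.1.0"  # placeholder version string

    def sha256(path: pathlib.Path) -> str:
        """Checksum an input file so the output can be traced back to it."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def calibrate(raw_path: pathlib.Path, out_path: pathlib.Path) -> None:
        """Hypothetical calibration step; the science content is a placeholder."""
        data = raw_path.read_bytes()  # stand-in for the real calibration work
        out_path.write_bytes(data)
        provenance = {
            "input": str(raw_path),
            "input_sha256": sha256(raw_path),
            "pipeline_version": PIPELINE_VERSION,
            "processed_utc": datetime.datetime.utcnow().isoformat() + "Z",
        }
        out_path.with_suffix(".provenance.json").write_text(
            json.dumps(provenance, indent=2))

A step written as a small, self-contained function like this is also straightforward to unit-test, which touches the testability goal above.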
Accessing data can be a burden and even a barrier to scientific progress. After discovering a dataset of interest, there often remains the difficulty of handling various data formats and reading the data. Numerous capabilities (HAPI, TAP, VSO, CDAWeb, and others) have arisen to help alleviate this problem. Some have specialized interfaces, while others build on standard APIs.
The goal of this session is to highlight the various programmatic data access tools in the Heliophysics community and to explore how we can evolve toward better interoperability.
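As an example of the standards-based end of this spectrum, the sketch below retrieves data through the HAPI protocol using the hapiclient Python package. The server, dataset, and parameter values are illustrative; any HAPI-compliant server could be substituted.

    # Minimal sketch of programmatic data access via HAPI (hapiclient package).
    # The server/dataset/parameter values are illustrative examples only.
    from hapiclient import hapi

    server     = "https://cdaweb.gsfc.nasa.gov/hapi"   # any HAPI server works
    dataset    = "OMNI2_H0_MRG1HR"                     # example dataset id
    parameters = "DST1800"                             # example parameter
    start      = "2023-01-01T00:00:00Z"
    stop       = "2023-01-02T00:00:00Z"

    # Returns a NumPy structured array plus the dataset metadata.
    data, meta = hapi(server, dataset, parameters, start, stop)
    print(meta["parameters"])                  # parameter descriptions from the server
    print(data["Time"][:5], data[parameters][:5])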
Research publications applying machine-learning (ML), and more recently deep-learning, techniques have grown exponentially over the last 10 years. For each problem that is tackled, a new one emerges and highlights the need for updated algorithms, techniques, and/or datasets. In many cases, the spatial and temporal resolution of the data, possible data gaps, the combination of time series with image data, and similar issues reveal that new datasets, i.e. new instruments and missions, are needed.
In this session we aim to provide an overview of the current state of various space weather (SWx) datasets as well as of algorithms and techniques for data processing. We also seek to discuss the development of benchmarking datasets that can serve in the future as a common "input" for validation analyses, and how combining physics with ML algorithms can lead to more efficient space weather forecasts. Our goal is to define guidelines for the future of ML/SWx modeling and forecasting. We therefore solicit abstracts that present and discuss data processing, analysis, modeling, and forecasting in Space Weather and Heliophysics.
Implied topics: boundaries between theory, modeling, and operations.
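As a small illustration of the benchmarking point, the sketch below fixes a chronological train/validation split, the kind of convention a common benchmark dataset would need to standardize so that different models are validated against the same "input". The arrays here are synthetic placeholders, not a real dataset.

    # Minimal sketch: a fixed, chronological train/validation split for a
    # space-weather time-series benchmark (shuffling would leak future data).
    # The feature and target arrays are synthetic placeholders.
    import numpy as np

    rng   = np.random.default_rng(0)
    times = np.arange("2020-01", "2023-01", dtype="datetime64[h]")
    X     = rng.normal(size=(times.size, 8))   # hypothetical input features
    y     = rng.normal(size=times.size)        # hypothetical forecast target

    split = np.datetime64("2022-01-01T00")     # benchmark-defined boundary
    train = times < split
    valid = ~train

    X_train, y_train = X[train], y[train]
    X_valid, y_valid = X[valid], y[valid]
    print(f"train: {train.sum()} hours, validation: {valid.sum()} hours")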
This session explores the use of computational notebooks, such as Jupyter and similar formats, as well as the software and infrastructure that support their usage. The focus will be on how these tools can foster collaboration, increase research efficiency, and promote reproducibility. We solicit presentations covering these and related themes.
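One example of supporting infrastructure in this space is parameterized, non-interactive notebook execution, which helps with reproducibility. The sketch below uses the papermill package; the notebook filenames and parameters are hypothetical placeholders.

    # Minimal sketch: execute a notebook non-interactively with explicit
    # parameters, so a collaborator (or a CI job) can reproduce the run.
    # Requires `pip install papermill`; notebook names and parameters are placeholders.
    import papermill as pm

    pm.execute_notebook(
        "analysis_template.ipynb",          # hypothetical input notebook
        "analysis_2023-10-09_run.ipynb",    # executed copy, kept as a record
        parameters={"event_date": "2023-10-09", "instrument": "example"},
    )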
The Python in Heliophysics Community (PyHC.org) encourages software developers to collaborate and demonstrate the use of their software packages with other packages, especially with full interoperability between the packages (e.g. the given data is passed from one package's data format to another and back at least once in the analysis process), in alignment with PyHC standards. This session seeks abstracts demonstrating effort towards, or achievement of, this interoperability between Python packages. Talks should outline what challenges were overcome to achieve interoperability, or what challenges blocked that goal. The expected presentation format is a few slides followed by a live demonstration of the work in a notebook or other executable and interactive environment (e.g. Binder, DeepNote, etc.) highlighting the cross- or inter-package use. Abstracts using one or more PyHC software packages together with at least one package not in PyHC are encouraged for these presentations.
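As a sketch of the kind of round trip meant here, the example below fetches data with hapiclient, moves it into a pandas DataFrame, wraps that as a sunpy TimeSeries, and converts it back. The dataset and parameter names are illustrative, and the exact TimeSeries construction may differ slightly between sunpy versions.

    # Minimal sketch of a cross-package round trip:
    # HAPI data -> pandas -> sunpy TimeSeries -> back to pandas.
    import pandas as pd
    import sunpy.timeseries
    from hapiclient import hapi, hapitime2datetime

    data, meta = hapi("https://cdaweb.gsfc.nasa.gov/hapi",
                      "OMNI2_H0_MRG1HR", "DST1800",
                      "2023-01-01T00:00:00Z", "2023-01-02T00:00:00Z")

    # hapiclient -> pandas: structured array to a DataFrame with a DatetimeIndex.
    df = pd.DataFrame({"dst": data["DST1800"]},
                      index=pd.DatetimeIndex(hapitime2datetime(data["Time"])))

    # pandas -> sunpy: wrap the DataFrame as a TimeSeries (units and metadata
    # can be attached here as well; omitted to keep the sketch short).
    ts = sunpy.timeseries.TimeSeries(df)

    # sunpy -> pandas: and back again, completing the round trip.
    print(ts.to_dataframe().head())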
This session will also have an 'unconference' portion (i.e. an open discussion following a few introductory presentations), focusing specifically on current methods for coordinate systems and conversions. While standardizing around a common framework such as Astropy would increase interoperability and reduce duplication of effort in the long run, it comes at the cost of significant refactoring effort and potential breaking changes. We will discuss strategies for overcoming these barriers, for transitioning packages to a common framework, and for cross-checking results between bespoke implementations and common frameworks. The discussion will take place with the aid of modern technology (e.g. a Miro board, https://miro.com/) to produce a record of the discussion and a roadmap for future development.
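For reference during the discussion, the sketch below shows what a solar coordinate conversion looks like when routed through the Astropy/sunpy framework rather than a bespoke implementation. The point, frames, and observation time are arbitrary examples.

    # Minimal sketch: a solar coordinate conversion expressed through the
    # Astropy/sunpy coordinate framework. The inputs are arbitrary examples.
    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from sunpy.coordinates import frames

    point = SkyCoord(30 * u.deg, -10 * u.deg,
                     frame=frames.HeliographicStonyhurst,
                     obstime="2023-10-09")

    # Transform to helioprojective coordinates as seen from Earth.
    hpc = point.transform_to(frames.Helioprojective(observer="earth",
                                                    obstime="2023-10-09"))
    print(hpc.Tx, hpc.Ty)

Results from a bespoke implementation can be cross-checked against output like this as part of the transition strategy discussed above.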
The Heliophysics community is moving towards routine release of software as open source, going beyond "software as software" efforts to include public availability of research-specific and task-specific code. In the context of this trend, recently adopted NASA Heliophysics Division and Science Mission Directorate policies require all SMD-funded software to be released as open source. This session discusses approaches to "open source as default", whether required or voluntary. All participants developing task-oriented software, such as for a specific research study or operational need, are welcome; those contributing to missions are particularly invited.
"Data portals" (also known as science gateways, virtual observatories, etc.) are web-based platforms that allow users to access a wide range of data resources, including datasets, metadata, and tools for data analysis and visualization. Data portals are critical for promoting open data and enabling researchers to find, access, and analyze the data they need. The overall goal of this session is to bring together researchers, data managers, and software developers working on data portals in Heliophysics research to exchange ideas, showcase their work, and discuss the challenges and opportunities in this rapidly evolving field.
We invite presentations that showcase data portals, the resources used to develop them, and the use cases they serve in advancing open Heliophysics research. Topics could include portal architecture and design decisions, innovative portal capabilities, data management and curation techniques, usability considerations, and portal interoperability.
This DASH session will focus on the recent development of community-oriented methods and software tools for forward synthesis and for inverting plasma and magnetic information in the solar corona. Such toolsets are important contributions to heliospheric research because they help predict variations in the space environment through an understanding of the magnetic interactions in the solar corona. We aim to support the interpretation of novel observations from new and future-generation instrumentation on the ground and in space. DKIST and uCoMP ground-based observations will revolutionize our understanding of the corona and of magnetism at the base of the heliosphere. In turn, this will improve our more limited understanding of the magnetic connectivity and origins of the solar wind at the interface between the Sun and the heliosphere, a question prompted by observations from the Parker Solar Probe (PSP), which are actively helping to close this gap.
This gathering is open to all coronal physics enthusiasts, and more particularly to scientists and engineers working on methods and software tools for interpreting coronal observations. Most community efforts and public tools are individual contributions that cross multiple areas of research. This gathering aims to promote cohesion and guide our efforts towards a common goal by exploring gaps and avenues for synergy in community-available tools and methods dedicated to coronal science.
FORMAT: Short elevator speeches (max 15 min total) for people who work on methods to introduce themselves and their interests, plus an Unconference (60 min total):
The Unconference format will have registered participants voting in polls on topics before the meeting. First, topic proposals will be requested, and then the participant community can rank the proposed topics. The session itself will run in one of the two ways described below, depending on how broad or specific the selected topics are.