Philosophy of STAC
The genesis of STAC was grounded in a few simple philosophies that orient the community's processes and direction. They are designed to help focus the effort, putting in some guardrails so we don't keep revisiting first principles. The full principles are on GitHub, but the main ones are:
- Small, flexible core - STAC aims to be easy to implement, and adaptable to existing implementations. STAC validation enables maximum flexibility by simply checking for the presence of key fields, allowing other implementation-specific fields to be consumed by STAC clients as required. All parts of STAC should be small pieces, loosely coupled, to enable agile evolution.
- Evolve best practices and extensions through real world use - While the STAC Core focuses on minimal specification, optional extensions are continually added to increase interoperability and usability of data exposed via STAC. These will arise from concrete STAC implementations, and may also push the core towards more flexibility.
- Heavy use of Links - Links between various items enable modeling of complex relationships. Links allow a client to traverse relationships across different STAC implementations. For example, a derived NDVI product produced by Company A should link to the corresponding surface reflectance data provided by Company B.
- HTML representations - It is important that we always consider both the machine-readable JSON and the human-usable HTML. STAC Items are designed to be easily transformed into HTML that can be browsed by humans and crawled by search engines, thus integrating with the web.
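The principles above can be made concrete with a small sketch. The field names below follow the STAC Item specification (a STAC Item is a GeoJSON Feature), but the specific id, geometry, and URLs are hypothetical placeholders, and the presence check is an illustration of the "small, flexible core" idea rather than real STAC validation:

```python
# A minimal, hypothetical STAC Item: a GeoJSON Feature with a few key
# fields, plus the links and assets that the principles above describe.
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "example-scene",  # hypothetical id
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]],
    },
    "properties": {"datetime": "2020-01-01T00:00:00Z"},
    "links": [
        {"rel": "self", "href": "https://example.com/items/example-scene.json"},
        # A 'derived_from' link points an output product back at its input
        # data, even when that input lives in another provider's catalog.
        {"rel": "derived_from", "href": "https://example.com/source-scene.json"},
    ],
    "assets": {
        "data": {"href": "https://example.com/example-scene.tif"},
    },
}

# In the spirit of the small, flexible core: a loose check that the key
# fields are present, leaving any extra fields untouched for clients.
required = {"type", "stac_version", "id", "geometry", "properties", "links", "assets"}
assert required <= item.keys()
```

Any additional, implementation-specific fields can simply ride along in the same dict; clients that don't understand them ignore them.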
Visions for STAC
STAC's evolution will be entirely dictated by whatever direction comes from the community that follows the core principles above. But there are a few visions for the future that we hope will guide the spec.
Geospatial Metadata Done Right for the Cloud
STAC aspires to be a key specification of the Cloud Native Geospatial future. There is an opportunity to get metadata 'right' with the new Cloud Native Geoprocessing engines that are emerging, as they can be built to use and create the right metadata for any information they consume or produce. Our hope is that STAC is early enough in the cloud native geospatial evolution that it can enable cross-platform interoperability from the beginning.
STAC complements the idea of Cloud Native Geospatial, an example of which is the Cloud Optimized GeoTIFF (COG). However, COGs by themselves don't have enough information to be crawled and indexed in any meaningful way without some other structure providing links and additional information.
Pairing a COG with a STAC Item provides all the additional metadata to make it useful, and the link structure of the Catalog ensures that search engines will be able to crawl it. By using HTML to render STAC Items, every COG can have an online location that users can interact with and can discover via search engines. Though COG is the ideal pair for a STAC Item, STAC can be used with any format - the only requirement is that it is an asset that can be downloaded. Our goal is to enable legacy data to be more easily exposed on the cloud, and to also provide a stable wrapper that can work with any new cloud native format.
Simply putting data on the cloud is not sufficient; it must be discoverable and usable. STAC offers a simple and reliable way to make data more accessible.
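As a sketch of the pairing described above, a COG becomes one entry in the Item's assets dictionary. The URLs and asset keys here are hypothetical; the media type shown is the one conventionally used to mark a COG asset in STAC catalogs:

```python
# Hypothetical assets section of a STAC Item, pairing the Item's
# metadata with a Cloud Optimized GeoTIFF. The media type string is the
# one conventionally used for COGs in STAC catalogs.
assets = {
    "visual": {
        "href": "https://example.com/scenes/example-scene.tif",  # hypothetical URL
        "type": "image/tiff; application=geotiff; profile=cloud-optimized",
        "title": "Cloud Optimized GeoTIFF",
    },
    # Any downloadable format can be an asset; COG is simply the ideal
    # pair. Legacy data can be exposed the same way.
    "legacy-metadata": {
        "href": "https://example.com/scenes/example-scene.xml",  # hypothetical URL
        "type": "application/xml",
    },
}

assert "cloud-optimized" in assets["visual"]["type"]
```

The asset entries carry only an href plus descriptive fields, which is what lets STAC wrap both legacy formats and new cloud-native ones.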
Tracking Provenance Across Catalogs
Creating a true Cloud Native Geospatial ecosystem requires not just provenance tracking within one cloud platform, but across all platforms. An algorithm might access orthorectified satellite imagery from a data provider via STAC, then apply surface reflectance correction, run an NDVI to assess plant health, or create a land cover classification. If the resulting outputs are also distributed with their own STAC metadata, then the data can refer to the parent STAC Items, allowing provenance tracking across platforms.
Tracking provenance to source has been a dream of most consumers of geospatial assets, and the shift to the cloud creates an opportunity to realize that dream. One can imagine a number of additional benefits, like a geospatial equivalent of Google's PageRank algorithm: by following derivative products back to the fundamental datasets and assets they came from, we could track the popularity and relevance of those datasets, enabling a truer understanding of how any geospatial data is actually used.
Flipping the Geospatial Search Paradigm
While everyone is familiar with Google and being able to search across every web page in the world, this is still not possible for geospatial data. STAC's vision for global search is not to send search requests to every STAC server, but simply to make every Item 'web-crawlable'.
Provenance links between catalogs contribute to discoverability by boosting a page’s ranking within search engine algorithms. Stable HTML means more links to those pages, which will push them higher in search results.
STAC creates a foundation for geospatial search innovation by standardizing metadata for imagery, promoting other Cloud Native Geospatial technologies like COG, and encouraging link-rich HTML pages for every single geospatial image.
STAC hopes to evolve to help standardize notifications and alerting, enabling multiple catalogs to easily stay in sync by subscribing to their sources. Search engines and STAC APIs will be able to stay up to date with the latest assets, and offline catalogs will be able to easily ingest the relevant metadata and assets when they are updated.
Advance the Geospatial Standards Conversation and Processes
Perhaps a less lofty goal, but one of the visions of STAC is to help advance the standards world as a whole, particularly in the geospatial domain. This means trying out new ways of collaborating and being open with all our learnings, practices and tools. It also means we seek to align with other next-generation standards efforts, to create a solid, coherent baseline of accessible standards. The community aims to be as collaborative as possible with other efforts, aligning as much as possible wherever it makes sense. We have been working with the Web Feature Service 3.0 group in the Open Geospatial Consortium, and have goals to interoperate with schema.org, DCAT, the NetCDF community, ISO and W3C.