SkyWatch mentored projects - ESA Summer of Code in Space

ESA Summer of Code in Space (SOCIS) is a program run by the European Space Agency that focuses on bringing student developers into open source software development for space applications. Students work with a mentoring organization (with potential support from ESA experts) on a three-month programming project during their summer break.

The focus of this year’s SOCIS is on Artificial Intelligence and Earth Observation, though all space-related open source projects are eligible.

How to apply?

Students apply on the Summer of Code in Space website for the project they want to work on before May 4, 2019. Note that a student can submit several applications, but only one may be selected. SkyWatch has been selected to mentor the following four projects. The projects need to be completed within three months, between June and September. Students can choose to work at the SkyWatch head office in Waterloo, or remotely.

Our projects

3D Satellite Collection Animation


Combine collected and predicted future EO collection thumbnails into a 3d animation that tracks selected constellations and their data collections within a time series.


There are a variety of open source tools that can be configured to simulate the historical and future orbital passes of various constellations. Given a GeoJSON input of historical and future collections, the goal of this project is to a) deliver a unique time-based animation that follows a given satellite’s orbit and animates its collections, b) allow the user to input a date range to restrict the animation, and c) pull in provided thumbnails for collections and display them in the animation.

The solution should allow the user to select the mission they wish to track, display results for that mission (which will be provided in GeoJSON), and “fly into the future” to show where and when future collections will occur (these will also be provided). The solution should be independent of sensor type but spatially accurate in both orbit and collection area.

Possible tools to deliver this animation include (but are not limited to) Cesium, VTS, iTowns, or a custom built solution.
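Whichever rendering tool is chosen, the date-range restriction in point b) amounts to filtering the provided GeoJSON before it reaches the animation layer. A minimal sketch, assuming each collection feature carries its acquisition time in an ISO 8601 `datetime` property (the property name is our assumption, not part of the provided format):

```python
import json
from datetime import datetime

def filter_collections(geojson_str, start, end):
    """Keep only collection footprints acquired inside [start, end].

    Assumes each feature stores its acquisition time in a 'datetime'
    property as an ISO 8601 string -- a naming assumption for this sketch."""
    fc = json.loads(geojson_str)
    kept = []
    for feature in fc.get("features", []):
        t = datetime.fromisoformat(feature["properties"]["datetime"])
        if start <= t <= end:
            kept.append(feature)
    return {"type": "FeatureCollection", "features": kept}
```

The filtered FeatureCollection can then be handed to Cesium (for example as CZML packets, which are plain JSON) so the animation clock only spans the user's chosen window.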

NLP EO Data Search


Create the ability to search and retrieve EO sources via natural language. For example "find all images between Jan. 1 and Jan. 10 of Mt. Everest from Sentinel 2".


Every day, massive amounts of earth observation data are collected. However, finding data across different formats, providers, and sensor types is challenging. This project requires an innovative natural language processing (NLP) solution for searching earth observation archives at varying degrees of abstraction. The resulting artificial intelligence will open new use cases for EO data.

The solution should support three levels of abstraction. The first should deliver a list of results from a provided catalog, allowing a user to specify sources, dates, and a general location, e.g., "find all images between Jan. 1 and Jan. 10 of Mt. Everest from Sentinel 2". The second layer of abstraction should allow the user to describe their most desired result, e.g., “find the most cloud-free image with the lowest sun angle of Mt. Everest in the last two years”. The third layer of abstraction should incorporate additional derivation from available spectral bands, e.g., “what percentage of my farm is growing well this week?”. Other outputs and examples are open for discussion.
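For the first level of abstraction, the target is a mapping from a sentence to structured search parameters. A toy rule-based sketch of that mapping (a real solution would use a proper NLP toolkit; this only illustrates the intended structured output, and the field names are our own):

```python
import re
from datetime import datetime

MONTHS = {"jan": 1, "feb": 2, "mar": 3, "apr": 4, "may": 5, "jun": 6,
          "jul": 7, "aug": 8, "sep": 9, "oct": 10, "nov": 11, "dec": 12}

def parse_query(text, year=2019):
    """Parse level-one queries of the form
    'find all images between Jan. 1 and Jan. 10 of Mt. Everest from Sentinel 2'
    into {start, end, location, source}. Returns None if the pattern
    does not match."""
    m = re.search(
        r"between\s+(\w+)\.?\s+(\d+)\s+and\s+(\w+)\.?\s+(\d+)"
        r"\s+of\s+(.+?)\s+from\s+(.+)",
        text, re.IGNORECASE)
    if not m:
        return None
    m1, d1, m2, d2, place, source = m.groups()
    return {
        "start": datetime(year, MONTHS[m1[:3].lower()], int(d1)),
        "end": datetime(year, MONTHS[m2[:3].lower()], int(d2)),
        "location": place,
        "source": source.rstrip(". "),
    }
```

The second and third levels would replace this hand-written pattern with learned intent and entity extraction, but the structured output they feed into a catalog query can stay the same shape.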

A catalog of sample data will be provided. Leveraging existing voice to text services (like Amazon Transcribe) is encouraged. Amazon Alexa integration would be a bonus.

Advanced Cloud Detection


Commercial satellite imagery has varying degrees of cloud mask quality. We believe ML/AI can be combined with external data sources (weather, basemaps, etc.) to advance cloud detection capabilities.


There are many research papers and dissertations written on detecting clouds in a raster image. In some cases, spectral bands are available that aid in classification. In others, there is less information available.

This challenge is relatively open-ended: find a unique way to combine alternate data sources with temporal optical imagery to detect clouds more accurately.

A catalog of sample data will be provided.
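To make the "external data" idea concrete, here is a deliberately simple sketch: a per-pixel brightness/flatness cloud test whose threshold is relaxed when a weather forecast reports high cloud cover. The thresholds, and the use of forecast cloud fraction as a prior, are our illustrative assumptions, not a proposed final method:

```python
import numpy as np

def cloud_mask(band_stack, weather_cloud_fraction):
    """Toy fusion of optical data with one external source.

    band_stack: float array (H, W, bands) of reflectances in [0, 1].
    weather_cloud_fraction: forecast cloud cover in [0, 1] -- the
    'external data' in this sketch.

    Bright, spectrally flat pixels are labelled cloud; a cloudier
    forecast lowers the brightness bar."""
    brightness = band_stack.mean(axis=2)
    flatness = band_stack.std(axis=2)
    threshold = 0.6 - 0.2 * weather_cloud_fraction  # prior relaxes the test
    return (brightness > threshold) & (flatness < 0.1)
```

A submission would be expected to go well beyond this (temporal stacks, learned classifiers, basemap differencing), but the same structure applies: the external source conditions the decision made on the imagery.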

Location reverse image search


Given an unreferenced image, predict the location it represents. This is a project that's been requested by law enforcement to help them connect images in their possession to suspects and locations.


Geo-referencing earth observation imagery is normally straightforward when products contain metadata that defines their location. However, what if you had an image that didn’t have any geo-referencing information? Would it be possible to determine where it was from?

We believe it is possible through a mix of computer vision, machine learning, and artificial intelligence.
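One common recipe for this kind of problem is retrieval: embed the query image with a vision model, then return the coordinates of the most similar image in a geotagged reference set. A minimal sketch of the retrieval step, assuming the embeddings already exist (the vectors and coordinates below are placeholders):

```python
import numpy as np

def predict_location(query_vec, ref_vecs, ref_coords):
    """Retrieval-style geolocation sketch.

    query_vec:  (D,) feature vector of the unreferenced image.
    ref_vecs:   (N, D) feature vectors of geotagged reference images.
    ref_coords: list of N (lat, lon) pairs.

    Returns the coordinates of the reference image with the highest
    cosine similarity to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    r = ref_vecs / np.linalg.norm(ref_vecs, axis=1, keepdims=True)
    best = int(np.argmax(r @ q))
    return ref_coords[best]
```

The hard part of the project is everything this sketch assumes away: learning embeddings in which visually similar places land near each other, and assembling a reference set with enough coverage to make the nearest neighbour meaningful.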