Description
#72 introduced a new "GC Uploader" Windmill app to handle Locus Maps exports exclusively.
However, we know that this is not the only type of file that our users will want to upload. Other formats we have already heard about include:
- Mapeo or CoMapeo exports (GeoJSON, or ZIP containing GeoJSON and attachments).
- KoboToolbox CSV/XLS export that was manually cleaned up by a user in Excel.
- Compressed files containing Esri Shapefile data.
- Compressed files with Timelapse template and annotation data, including media files.
- CSV or GeoJSON data from other sources.
Rather than creating a separate "GC Uploader" app for each file type, we could adapt the existing application to recognize different formats and schemas, triggering the appropriate connector script automatically.
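To illustrate one possible shape of this, here is a minimal Python sketch of format detection and connector dispatch. Everything here is hypothetical (the `detect_format` heuristics, the `CONNECTORS` mapping, and the connector script paths are placeholders, not existing GuardianConnector code); the actual detection rules and dispatch mechanism are exactly what this scoping issue should determine.

```python
import zipfile
from pathlib import Path

def detect_format(file_path: str) -> str:
    """Guess the upload format from the extension and file contents.

    Hypothetical heuristic only; real detection might also inspect
    GeoJSON schemas, CSV headers, or ZIP contents in more depth.
    """
    path = Path(file_path)
    suffix = path.suffix.lower()

    if suffix == ".geojson":
        return "geojson"
    if suffix in (".csv", ".xls", ".xlsx"):
        return "tabular"
    if suffix in (".zip", ".kmz"):
        with zipfile.ZipFile(path) as zf:
            names = [n.lower() for n in zf.namelist()]
            if any(n.endswith(".shp") for n in names):
                return "shapefile"
            if any(n.endswith(".geojson") for n in names):
                return "comapeo"  # ZIP of GeoJSON plus attachments
    return "unknown"

# Hypothetical mapping from detected format to an existing connector script.
CONNECTORS = {
    "geojson": "f/connectors/geojson_import",
    "tabular": "f/connectors/tabular_import",
    "shapefile": "f/connectors/shapefile_import",
    "comapeo": "f/connectors/comapeo_import",
}

def main(file_path: str) -> dict:
    fmt = detect_format(file_path)
    if fmt == "unknown":
        raise ValueError(f"Unrecognized file format: {file_path}")
    # In Windmill, the app would then trigger the matching script or flow
    # (mechanism to be decided during scoping).
    return {"format": fmt, "connector": CONNECTORS[fmt]}
```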
Perhaps more ambitiously, in the same scope of work we might consider consolidating two or more connector scripts that are broadly similar in flow, and only differ in relation to data structures.
The user experience could be roughly similar to Felt's "Upload Anything": a single entry point that handles the many and diverse file formats our users want to store in GuardianConnector.
This issue will focus on technical scoping to determine the best approach in Windmill for extending the GC Uploader to support multiple file formats. The issue will be closed once we have converged on a clear approach, at which point implementation work can be scoped and planned separately.
TODO:
- Consider the best approach to error handling for the utility scripts used by this future tool, and the return requirements in the Windmill app (a sketch of one possible return shape follows below).
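As a starting point for that discussion, here is a minimal sketch of a consistent return shape that utility/connector scripts could use, so the Windmill app can render success and error states uniformly. The field names and structure are assumptions, not an existing convention.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class UploadResult:
    """Hypothetical return contract for utility/connector scripts."""
    ok: bool
    rows_imported: int = 0
    warnings: list[str] = field(default_factory=list)
    errors: list[str] = field(default_factory=list)

def main(file_path: str) -> dict:
    result = UploadResult(ok=True)
    try:
        # ... parse the file and write rows to the warehouse ...
        result.rows_imported = 42  # placeholder value
    except Exception as exc:
        result.ok = False
        result.errors.append(str(exc))
    # Windmill serializes the returned dict for the app to consume.
    return asdict(result)
```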
A previously compiled user story
Justification:
Many Indigenous communities already have some data of their own that was created in previous projects, or may work with tools that are not directly part of the GuardianConnector toolkit. For example, they may have GPX data from working with a Garmin GPS, spreadsheet data from working with Excel, or Shapefiles from working with Esri or QGIS.
In addition, most Indigenous communities do not have access to someone, whether a member of the community or an external ally, who knows how to manipulate SQL directly. (In Rudo’s experience, this is a rare skill even among professional NGOs that might have GIS staff.)
The data warehouse is intended to be a centralized hub for all of the community’s data. Hence, communities should be able to upload their external data without needing to work with SQL, or requiring the help of the CMI (or another) technical team.
User story:
As an Indigenous user of the Biocultural Monitoring System who does not know how to use SQL,
I want to upload my external CSV, XLS, SHP, GeoJSON, KML, or GPX data to the data warehouse,
So that I can have all of my data stored in one centralized, secure place, and so I can use it in my visualizations.
Acceptance criteria:
- There is a page I can navigate to in my private dashboard.
- The page has a clear ‘Select or create table’ button on the UI.
- Clicking the button opens an interface where I can explore my SQL database tables (in the `public` db or similar).
- I can select an existing database table, or create a new one.
- The page also has a clear ‘Upload’ button on the UI.
- File types are clearly indicated (e.g. KMZ, shapefile).
- A progress bar shows the upload status.
- A success message appears once the upload is complete.
- The tool creates any columns that do not exist (as TEXT).
- The tool imports all of the data into the new or existing columns (see the sketch after this list).
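For the last two criteria, here is a minimal sketch of the column-creation and import behavior, assuming a PostgreSQL warehouse accessed with psycopg2. The function names are illustrative, and identifier handling is simplified relative to what production code would need.

```python
import psycopg2

def ensure_columns(conn, table: str, columns: list[str]) -> None:
    """Add any missing columns to the target table as TEXT."""
    with conn.cursor() as cur:
        for col in columns:
            cur.execute(
                f'ALTER TABLE "{table}" ADD COLUMN IF NOT EXISTS "{col}" TEXT'
            )
    conn.commit()

def import_rows(conn, table: str, rows: list[dict]) -> None:
    """Insert each row, creating missing columns first so no data is dropped."""
    if not rows:
        return
    columns = sorted({key for row in rows for key in row})
    ensure_columns(conn, table, columns)
    col_list = ", ".join(f'"{c}"' for c in columns)
    placeholders = ", ".join(["%s"] * len(columns))
    with conn.cursor() as cur:
        for row in rows:
            cur.execute(
                f'INSERT INTO "{table}" ({col_list}) VALUES ({placeholders})',
                [row.get(c) for c in columns],
            )
    conn.commit()
```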