DatahubTool


Toolkit for batch management of data in the DuraMAT Datahub, such as upload and delete.

NOTE: only authorized users can manage the data of projects they own in Datahub. An API key is required; it is shown on your user page in Datahub.

Installation

pip install datahubtool

Package overview

Here is a high-level overview of the package's main functions; a sketch combining them follows the list.

  • 'run_upload_pipeline': Run the complete data-upload pipeline
  • 'get_local_file_names': Get the names of the local files to upload in a given path
  • 'get_Datahub_file_names': Get the names of all existing files in a given package in Datahub
  • 'upload_files': Upload files to Datahub
  • 'delete_Datahub_files': Delete files in Datahub
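
For illustration, here is a minimal sketch of how the lower-level functions could be combined into a manual upload flow. The exact signatures are not documented in this README, so the arguments below (the file lists, package name, headers, and 'file_format' keyword) are assumptions inferred from the 'run_upload_pipeline' example, not the package's confirmed API.

from DatahubTool import (get_local_file_names, get_Datahub_file_names,
                         upload_files)

headers = {'Authorization': 'your API key'}
folder_path = 'Path of the folder to upload'
datahub_package_name = 'Datahub package name'

# Assumed signatures: list the local candidate files, list what already
# exists in the Datahub package, then upload only the missing files.
local_files = get_local_file_names(folder_path, file_format='.csv')
existing_files = get_Datahub_file_names(datahub_package_name, headers)
new_files = [f for f in local_files if f not in existing_files]
upload_files(new_files, datahub_package_name, headers)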

Example of uploading files

from DatahubTool import run_upload_pipeline

# Replace with the API key shown on your Datahub user page
headers = {'Authorization': 'your API key'}
folder_path = 'Path of the folder to upload'
datahub_package_name = 'Datahub package name'
file_format = '.csv'

run_upload_pipeline(folder_path, datahub_package_name, headers,
                    file_format=file_format)

When complete, it will print:

2/2 file(s) uploaded on 2023-11-17 14:20:24
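
Deleting files works along the same lines. 'delete_Datahub_files' is listed in the overview above, but its signature is not shown in this README, so the call below is a hedged sketch with assumed arguments.

from DatahubTool import delete_Datahub_files

headers = {'Authorization': 'your API key'}
datahub_package_name = 'Datahub package name'

# Assumed signature: remove the named files from the given Datahub package.
files_to_delete = ['old_data.csv']
delete_Datahub_files(files_to_delete, datahub_package_name, headers)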

Authors

Baojie Li (LBNL)
