dsotirho-ucsc opened this issue on Jan 22, 2025 · 4 comments

Labels:
- `-` [priority] Medium
- `data` [subject] Data or metadata [use of this label is uncommon]
- `debt` [type] A defect incurring continued engineering cost
- `enh` [type] New feature or request
- `orange` [process] Done by the Azul team
- `spike:2` [process] Spike estimate of two points
dsotirho-ucsc added the `enh` [type] New feature or request, `data` [subject] Data or metadata, `debt` [type] A defect incurring continued engineering cost, `-` [priority] Medium, and `spike:2` [process] Spike estimate of two points labels on Jan 22, 2025.
I think we’ve discussed this one a little in the past. We currently have no capability to mirror data across environments, nor any way to move data from the production pipeline into the development pipeline. Our only option would be to have the wranglers restate all of the datasets and process them through dev.
@hannes-ucsc: "The Broad indicates that they cannot copy snapshots from prod to dev, and that the only recourse is to provide staging areas for them to import into TDR dev. Assignee to coordinate with the involved parties (Broad & LungMAP) in order to create a mirror of catalog lm8 in dev. This will require creating or reactivating the staging areas that were imported into TDR prod for lm8."
The following LungMAP snapshots will be removed from `dev` as part of #6516 (PR #6747) due to invalid metadata schema locations:

These snapshots should be replaced with updated snapshots from prod.