NeuroBagel roadmap #2807
Yes, just a couple comments.
Linking out to the query page is simpler. If we want to add it to OpenNeuro search, we would need to extend our search schema and GraphQL snapshot type with a namespace for this and add UI to set the search parameters in that namespace. I think we agreed we only wanted to include OpenNeuro dataset results in this listing on the OpenNeuro display side?
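To make the schema work concrete: a hypothetical GraphQL SDL sketch of what a NeuroBagel namespace on the snapshot type might look like. Every type and field name here is invented for illustration, not part of OpenNeuro's actual schema.

```graphql
# Hypothetical only: names are placeholders, not OpenNeuro's schema.
type NeurobagelSearchParams {
  diagnosis: String
  assessmentTool: String
  minAge: Float
  maxAge: Float
}

extend type Snapshot {
  # Annotation-derived fields that a search namespace could expose.
  neurobagel: NeurobagelSearchParams
}
```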
This could be a page on OpenNeuro that NeuroBagel links to with a URL back to NeuroBagel as a parameter, letting the user preview the changes and upload those files with their OpenNeuro session.
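As a concrete illustration of that round trip, here is a minimal Python sketch. The page path and the `return_url` parameter name are assumptions for illustration, not actual OpenNeuro or NeuroBagel routes.

```python
# Hypothetical sketch of the "return URL" handoff described above.
# The path "annotation-preview" and parameter "return_url" are invented.
from urllib.parse import urlencode, urlparse, parse_qs

def build_preview_link(openneuro_base: str, neurobagel_return: str) -> str:
    """Build an OpenNeuro preview URL that carries a link back to NeuroBagel."""
    query = urlencode({"return_url": neurobagel_return})
    return f"{openneuro_base}/annotation-preview?{query}"

def extract_return_url(link: str) -> str:
    """Recover the NeuroBagel return URL on the OpenNeuro side."""
    return parse_qs(urlparse(link).query)["return_url"][0]

link = build_preview_link("https://openneuro.org", "https://annotate.neurobagel.org/")
assert extract_return_url(link) == "https://annotate.neurobagel.org/"
```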
Yes, this is what I also wrote down.
Thanks, this is very helpful. We're currently thinking that we will
That will work! We still have some minor issues to iron out but this should be something we can do soon. Do you have any preferences for what users should see as a result of their query?
I may be misunderstanding, but the NeuroBagel annotation tool will work like the BIDS validator: everything happens on the user's machine, and the tool itself won't have to talk to any remote backend (for now). So the end of the process is that you can download the data dictionary you created back to your local filesystem. Or do you mean another kind of uploader?
Very happy to. Our docs are a little out of sync at the moment; the most up-to-date form of what we create is here: https://github.com/neurobagel/bagel-cli/blob/f6e22d85a1536b50e815a5f9199f63ca58a8b06f/bagel/dictionary_models.py. This is meant to be a temporary format, so we're happy to replace it with another temporary format that is more likely to be compatible with the BIDS spec while we work on a more formal proposal. I'm assuming that would be done as part of a BEP?
🚀
@effigies @nellh
Hey folks,

Our main update is on the search, so I'll do that first and then go through @effigies' original points in order, proposing next steps for each.

**Search of NeuroBagel annotations of OpenNeuro datasets.** We're ready for you to link to a participant-level search of OpenNeuro datasets. 🎉 Here is the link: https://query.neurobagel.org/?node=OpenNeuro.
Next steps:
Let us know if that looks good to you; getting this connection going is the main point we want to discuss with you.

**Annotate Poldrack-lab-owned dataset(s) with NeuroBagel annotations and release, for testing.** This is ready to go as well. We have updated our browser annotator: https://annotate.neurobagel.org/. It now makes valid data dictionaries you can use in the graph.
The output data dictionary hasn't been updated yet to make use of bids-standard/bids-specification#1603 to add TermURLs for the Levels. That's a next step for us, so if you try out the annotation we'd appreciate some feedback on the output format - there is quite a bit in there that goes beyond the Levels-TermURL and we'd like to find a format that can be useful for you as well. Next:
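For reference, a rough sketch of the shape that bids-standard/bids-specification#1603 enables, with objects (rather than bare strings) as `Levels` values. The term URLs below are placeholders, not real vocabulary terms.

```json
{
  "sex": {
    "Description": "Sex of the participant",
    "Levels": {
      "M": {
        "Description": "Male",
        "TermURL": "https://example.org/vocab/male"
      },
      "F": {
        "Description": "Female",
        "TermURL": "https://example.org/vocab/female"
      }
    }
  }
}
```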
**Direct uploaders to NeuroBagel to annotate.** We don't have an update on this, but one option for now would be to tell folks to add annotations for existing datasets directly through https://github.com/OpenNeuroDatasets-JSONLD, and then they will become searchable. Next:
**Begin proposal to make annotations maximally encodable in BIDS.** We now have bids-standard/bids-specification#1603 to allow TermURLs in Levels and will start using this in our annotations output. There is still other information outside "Levels" that we need a place for, like "isAbout" and "isPartOf" (see https://neurobagel.org/dictionaries/#assessment-tool); we'll start focusing on these early next year - any ideas appreciated. Next:
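To illustrate the kind of information that currently has no BIDS home: a hedged sketch of an assessment-tool column annotated along the lines the linked NeuroBagel docs describe, with `IsAbout` and `IsPartOf` under an `Annotations` key. The column name and term identifiers are placeholders, not real vocabulary terms.

```json
{
  "tool_item_1": {
    "Description": "First item of some assessment tool",
    "Annotations": {
      "IsAbout": {
        "TermURL": "https://example.org/vocab/Assessment",
        "Label": "Assessment tool"
      },
      "IsPartOf": {
        "TermURL": "https://example.org/vocab/someToolBattery",
        "Label": "Some assessment battery"
      }
    }
  }
}
```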
**Add annotation widget when ready.** The https://annotate.neurobagel.org/ tool is ready and up to date with our data model, but doesn't yet output pure BIDS data dictionaries. I don't think it makes sense to use it yet for data that goes on OpenNeuro directly. It could make sense, though, to provide a link to it and explain that annotations go into https://github.com/OpenNeuroDatasets-JSONLD, so that users can augment those until the annotations are fully BIDS. Next:
Let me know if these points, and especially the next steps, make sense to you all. Also happy to chat in person if that works better. Until soon!
Hi @surchs, thanks for the post.
Notes:
Great, thanks for the links and the bug report! Will reply directly on #2956 for search / annotate.
We already have ds000001 in https://github.com/orgs/OpenNeuroDatasets-JSONLD/repositories, so anything that isn't in there and you know a bit would be a good start. We're working on making the "I made an annotation now what" process more automatic (see OpenNeuroDatasets-JSONLD/.github#17), but for now our data dictionaries get dumped into https://github.com/neurobagel/openneuro-annotations and then we process them from there. So you could pick any dataset you don't see in there and try out the annotation (https://neurobagel.org/annotation_tool/)
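The "pick any dataset you don't see in there" step is essentially a set difference. A minimal sketch, with the caveat that the dataset IDs below are examples and, in practice, the annotated list would come from the OpenNeuroDatasets-JSONLD org rather than being hard-coded:

```python
def datasets_needing_annotation(all_datasets, annotated):
    """Return accession numbers that don't yet have a JSON-LD mirror."""
    return sorted(set(all_datasets) - set(annotated))

# ds000001 is already mirrored (per the comment above); the other IDs
# are examples standing in for the full OpenNeuro listing.
todo = datasets_needing_annotation(
    ["ds000001", "ds000002", "ds000117"],
    ["ds000001"],
)
print(todo)  # → ['ds000002', 'ds000117']
```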
Alright, fair. We're going to have a chat with Eric and Sam on the BEP36 and we can talk about the peaceful coexistence of BIDS.json and neurobagel.json there.
Cool, I like the idea of "completeness" feedback! Maybe @rmanaem and @alyssadai have some thoughts on this too.
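One possible "completeness" metric, as a hedged sketch: the fraction of `participants.tsv` columns that carry an `Annotations` block in the accompanying data dictionary. The key name follows the NeuroBagel data dictionaries discussed above; the column names are invented.

```python
def annotation_completeness(tsv_columns, data_dictionary):
    """Fraction of TSV columns that have an 'Annotations' entry."""
    if not tsv_columns:
        return 1.0
    annotated = sum(
        1 for col in tsv_columns
        if "Annotations" in data_dictionary.get(col, {})
    )
    return annotated / len(tsv_columns)

dictionary = {
    "participant_id": {"Annotations": {"IsAbout": {"Label": "Subject ID"}}},
    "age": {"Description": "Age in years"},  # described, but not annotated
}
print(annotation_completeness(["participant_id", "age"], dictionary))  # → 0.5
```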
Based on today's meeting, we have the following short-term steps, in order of ease:

- Direct users to annotate their `participants.tsv` and get a `participants.json` to annotate their dataset. Probably easiest with a step-by-step doc or video.

Long-term steps:
Does this match everybody's recollection? @surchs @nellh
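The `participants.tsv` → `participants.json` step above could be sketched as a tiny script that turns the TSV header into an empty data-dictionary skeleton for the annotation tool to fill in. The exact fields a real dictionary needs are an assumption here; only `Description` is shown.

```python
# Hedged sketch: derive a participants.json skeleton from a
# participants.tsv header row. Field names beyond "Description"
# are assumptions, not a fixed format.
import csv
import io
import json

def skeleton_from_tsv(tsv_text: str) -> dict:
    """Build an empty data-dictionary entry for each TSV column."""
    header = next(csv.reader(io.StringIO(tsv_text), delimiter="\t"))
    return {column: {"Description": ""} for column in header}

tsv = "participant_id\tage\tsex\nsub-01\t34\tM\n"
print(json.dumps(skeleton_from_tsv(tsv), indent=2))
```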