
The details of the pre-training. #16

Open
Swecamellia opened this issue Oct 25, 2024 · 2 comments

Comments

@Swecamellia

I am very interested in the pre-training process. Could you please provide the relevant code details?
That way, I can apply it to my own dataset. Thank you very much!

@BodongDu

BodongDu commented Dec 3, 2024

I have the same request as you, and I am looking forward to your reply!

@bighan123

Based on the paper details (Fig. 2), I think the pre-training approach is masked autoencoding (like MAE), and the main difference is that the authors try to predict tokens in both the spatial and frequency domains, but I am not sure whether the code is entirely based on the paper description.
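
For anyone looking for a starting point while waiting for the authors' reply, here is a minimal PyTorch sketch of what such dual-domain masked pre-training could look like. Everything in it (the `ToyMaskedAutoencoder`, the patch size, the masking ratio, and the 0.5 weight on the frequency term) is my own assumption for illustration, not the paper's actual implementation.

```python
# Minimal sketch of MAE-style pre-training with a combined spatial + frequency
# reconstruction target. NOT the authors' code; all hyperparameters are guesses.
import torch
import torch.nn as nn


class ToyMaskedAutoencoder(nn.Module):
    """Hypothetical MAE-style model over non-overlapping patches."""

    def __init__(self, img_size=64, patch=8, dim=128):
        super().__init__()
        self.patch = patch
        self.embed = nn.Linear(patch * patch * 3, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=2
        )
        self.decoder = nn.Linear(dim, patch * patch * 3)

    def patchify(self, x):
        b, c, h, w = x.shape
        p = self.patch
        x = x.unfold(2, p, p).unfold(3, p, p)                 # (b, c, h/p, w/p, p, p)
        return x.permute(0, 2, 3, 1, 4, 5).reshape(b, -1, c * p * p)

    def forward(self, x, mask_ratio=0.75):
        patches = self.patchify(x)
        b, n, _ = patches.shape
        # Randomly mask a subset of patches; the loss is computed only on them.
        mask = torch.rand(b, n, device=x.device) < mask_ratio
        tokens = self.embed(patches)
        tokens = tokens.masked_fill(mask.unsqueeze(-1), 0.0)  # crude masking
        pred = self.decoder(self.encoder(tokens))
        return pred, patches, mask


def dual_domain_loss(pred, target, mask):
    """Reconstruction loss in the spatial domain plus the frequency domain.

    The frequency term compares FFT magnitudes of predicted vs. target patches;
    the 0.5 weighting is an arbitrary assumption, not taken from the paper.
    """
    spatial = ((pred - target) ** 2).mean(dim=-1)
    freq_pred = torch.fft.rfft(pred, dim=-1).abs()
    freq_tgt = torch.fft.rfft(target, dim=-1).abs()
    frequency = ((freq_pred - freq_tgt) ** 2).mean(dim=-1)
    per_patch = spatial + 0.5 * frequency
    return (per_patch * mask).sum() / mask.sum().clamp(min=1)


if __name__ == "__main__":
    model = ToyMaskedAutoencoder()
    imgs = torch.randn(2, 3, 64, 64)
    pred, target, mask = model(imgs)
    loss = dual_domain_loss(pred, target, mask.float())
    loss.backward()
    print(loss.item())
```

The only point this is meant to show is where a frequency-domain term would plug into an otherwise standard masked-reconstruction loss; the real model, masking strategy, and loss weights would need to come from the authors.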
