- The University of Hong Kong
- Singapore ⇌ Shanghai ⇌ Hong Kong
- qiushisun.github.io
- @qiushi_sun
- in/qiushi-sun
📜 Papers with Code
Data and Code for Reproducing "Global Relational Models of Source Code"
Implementation of the paper "Language-agnostic representation learning of source code from structure and context".
Official code repository of "BERTology Meets Biology: Interpreting Attention in Protein Language Models."
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377
Datasets, tools, and benchmarks for representation learning of code.
Code and source for the paper "How to Fine-Tune BERT for Text Classification?"
Unofficial PyTorch implementation of Masked Autoencoders Are Scalable Vision Learners
Reference BLEU implementation that auto-downloads test sets and reports a version string to facilitate cross-lab comparisons
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
Re-implementation of "code2seq: Generating Sequences from Structured Representations of Code"