Training results of some models on the DROP dataset (AI2).
My first time doing a CTF reverse-engineering challenge. It was so hard!
Source: https://stackoverflow.com/questions/16987670/dynamic-programming-why-knuths-improvement-to-optimal-binary-search-tree-on2
Computer Network 2021FA coding assignment 1: DNS-Relay
RGB and YUV Colorspace
Time to fill in the gaps!
Analog Electronics starts this semester, so it's time to use the Verilog language.
A good tutorial on CMake is here. You can also use the official documentation as a reference.
This proof is by Namita Tiwari. Thanks!
Use Docker as a virtual environment for testing your projects.
WakaTime is a tool you can use to track your coding (read: slacking-off) time.
Codespaces: a cloud IDE by GitHub & Microsoft, previously named "Visual Studio Online".
Use your own CSS files to customize the site.
Build your repo automatically with Travis CI.
This is a review of the album "folklore" by Taylor Swift, released on 2020/7/24.
Matplotlib 3D Toolkit Note
From Prof. Xiang Luo
CiteSpace: a piece of software for visual analysis of academic literature.
The "8 numbers problem" in CPII 2020 Spring: a solution, plus the basic techniques it relies on.
Not long after playing NieR:Automata, I was unexpectedly won over by its healing story, its widely praised soundtrack, and…
Notes on configuring OpenPose and the pitfalls I hit along the way.
Some commonly used Git operations.
Paper reading for [NeurIPS 2020] Denoising Diffusion Probabilistic Models by Jonathan Ho, Ajay Jain, and Pieter Abbeel. The paper link is here at NeurIPS Proceedi...
Paper reading for [CVPR 2021] Taming Transformers for High-Resolution Image Synthesis, a.k.a. #VQGAN, a CVPR 2021 oral by Patrick Esser et al. The arXiv link is h...
Paper reading for [CVPR 2022] Learning to Answer Questions in Dynamic Audio-Visual Scenarios. The arXiv link is here: https://arxiv.org/pdf/2203.14072.pdf
This is my reading note for [NeurIPS 2021] Multimodal Few-Shot Learning with Frozen Language Models.
This is my reading note for the MobileNets series.
In this blog post, we will go through several classic CNN structures that build the backbones of Computer Vision.
This is my reading note for [NeurIPS 2019] Levenshtein Transformer.
This is my reading note for [ICLR 2018] Unsupervised Neural Machine Translation.
This is my reading note for [ICLR 2019] Parameter-Efficient Transfer Learning for NLP.
This is my reading note for [ICLR 2018] Non-Autoregressive Neural Machine Translation.
This is my reading note for [NeurIPS 2014] Sequence to Sequence Learning with Neural Networks.
This is my reading note for [NAACL 2019] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
This is my reading note for [NeurIPS 2017] Attention is All You Need.
This is my reading report for Multimedia Analysis, 2021 Spring, at USTC.