Interview notes for 2018/2019 campus recruitment (spring and autumn hiring): algorithms, machine learning, deep learning, natural language processing (NLP), C/C++, and Python.
Facilitating the design, comparison and sharing of deep text matching models.
A recurrent attention module: an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN cell. (LARNN)
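The idea of a cell querying a window of its own past states can be sketched roughly as follows. This is a simplified, single-head dot-product stand-in for the LARNN's windowed multi-head attention; the window size, the `tanh` candidate state, and the 0.5 blend coefficient are illustrative assumptions, not the repository's actual formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def windowed_attention(query, past_states, window=5):
    """Attend the current query over the last `window` cell states
    (single-head dot-product attention, scaled as in the Transformer)."""
    keys = np.stack(past_states[-window:])           # (w, d)
    scores = keys @ query / np.sqrt(query.shape[0])  # (w,)
    weights = softmax(scores)                        # sums to 1
    return weights @ keys                            # (d,) weighted mix

# Toy recurrence: each step blends a fresh candidate cell state with
# an attention summary of the recent past, then appends it.
d = 4
rng = np.random.default_rng(0)
states = [rng.standard_normal(d)]
for t in range(10):
    c_new = np.tanh(rng.standard_normal(d))
    attended = windowed_attention(c_new, states, window=5)
    states.append(0.5 * c_new + 0.5 * attended)

print(len(states))  # 11 states: the initial one plus 10 steps
```

The point is only that the attention lookup sits inside the recurrence, so the cell can be dropped into an ordinary RNN-style loop over the state list.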
TensorFlow implementation of "A Structured Self-Attentive Sentence Embedding".
Collection of sample applications using JUnit 5.
A toolbox for network data creation.
PyTorch Tutorial for Deep Learning Researchers
Text classification based on CNNs.
PyTorch implementation of Google AI's 2018 BERT.
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series).
Annotator for Chinese text corpora (UNDER DEVELOPMENT).
Task-oriented dialog system toolkits
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
DSTC8 Track 1 Task 1 (End-to-End Multi-Domain Dialog Challenge) result.