About the author: 郭宗沂 (Guo Zongyi), algorithm engineer, AI Lab, TEG Technology Engineering Platform Group, 58同城 (58.com).

About the team: The AI Lab of 58同城's TEG Technology Engineering Platform Group works to bring AI technology into production at 58同城 and to build AI middle-platform capabilities that improve the efficiency of front-line business teams and the user experience. Products the AI Lab is currently responsible for include intelligent customer service, voice bots, automated article writing, an intelligent speech analytics platform, an intelligent marketing system, an AI algorithm platform, and speech recognition; going forward, the team will continue to accelerate innovation and expand AI applications. For more about the team, visit: ailab.58.com