Title: | Distributional semantics using neural networks: technical report no. DCSE/TR-2016-04 |
Authors: | Svoboda, Lukáš |
Issue Date: | 2016 |
Publisher: | University of West Bohemia |
Document type: | report |
URI: | http://www.kiv.zcu.cz/cz/vyzkum/publikace/technicke-zpravy/ http://hdl.handle.net/11025/25377 |
Keywords: | neural networks;semantics;natural language processing |
Keywords in different language: | neural networks;semantics;natural language processing |
Abstract in different language: | In recent years, neural networks have shown substantial improvements in capturing the semantics of words and sentences. They have also improved language modeling, which is crucial for many tasks in Natural Language Processing (NLP). One of the most widely used Artificial Neural Network (ANN) architectures in NLP is the Recurrent Neural Network (RNN), which is not restricted to a fixed context size. Through recurrent connections, information can cycle inside these networks for an arbitrarily long time. The report summarizes the state-of-the-art approaches to distributional semantics and also focuses on the further use of ANNs for NLP problems. |
Rights: | © University of West Bohemia in Pilsen |
Appears in Collections: | Zprávy / Reports (KIV) |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Svoboda.pdf | Full text | 796.49 kB | Adobe PDF