Empirical Methods in Natural Language Generation: Data-oriented Methods and Empirical Evaluation - download pdf or read online
By Regina Barzilay (auth.), Emiel Krahmer, Mariët Theune (eds.)
Natural language generation (NLG) is a subfield of natural language processing (NLP) that is often characterised as the study of automatically converting non-linguistic representations (e.g., from databases or other knowledge sources) into coherent natural language text. In recent years the field has evolved considerably. Perhaps the most important new development is the current emphasis on data-oriented methods and empirical evaluation. Progress in related areas such as machine translation, dialogue system design and automatic text summarization (and the resulting awareness of the importance of language generation), the increasing availability of suitable corpora, and the organization of shared tasks for NLG, in which different teams of researchers develop and evaluate their algorithms on a shared, held-out data set, have together had a considerable impact on the field. This book offers the first comprehensive overview of recent empirically oriented NLG research.
Read or Download Empirical Methods in Natural Language Generation: Data-oriented Methods and Empirical Evaluation PDF
Best nonfiction_6 books
- Elements of the doctrine of metres
- The Fundamentals of Signal Transmission - In Line, Waveguide, Fibre and Free Space
- Shiver And Spice (Harlequin Blaze)
- Jewel ornament of liberation
- Seismic Traveltime Tomography of the Crust and Lithosphere
Additional resources for Empirical Methods in Natural Language Generation: Data-oriented Methods and Empirical Evaluation
The same method can also be employed to align answers to question types as a semantic role labelling task. Our generation approach is also strongly related to work that constructs symbolic semantic structures via an assignment process in order to provide surface realisers with input. Our approach differs in that we do not begin with a fixed set of semantic labels. Additionally, our end goal is a dependency tree that encodes word precedence order, bypassing the surface realisation stage.
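The idea of a dependency tree that directly encodes word precedence, so that a sentence can be read off without a separate surface realisation stage, can be illustrated with a small sketch. The tree encoding and function name below are hypothetical, not taken from the chapter:

```python
# Illustrative sketch: if each node's left/right dependents are stored in
# precedence order, an in-order traversal yields the surface word order.

def linearize(node):
    """In-order traversal: left dependents, then the head word, then right
    dependents. Assumes child lists are already in precedence order."""
    words = []
    for child in node.get("left", []):
        words.extend(linearize(child))
    words.append(node["word"])
    for child in node.get("right", []):
        words.extend(linearize(child))
    return words

# A toy tree for "the cat sat on the mat", rooted at the verb.
tree = {
    "word": "sat",
    "left": [{"word": "cat", "left": [{"word": "the"}]}],
    "right": [{"word": "on",
               "right": [{"word": "mat", "left": [{"word": "the"}]}]}],
}

print(" ".join(linearize(tree)))  # the cat sat on the mat
```

The point of the sketch is only that precedence-annotated dependency trees make linearisation trivial; the hard part, which the chapter addresses, is constructing such a tree in the first place.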
1–8 (2000)
3. : Paraphrasing with bilingual parallel corpora. In: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL 2005), Ann Arbor, Michigan, pp. 597–604 (2005)
4. : Probabilistic approaches for modeling text structure and their application to text-to-text generation. In: Krahmer, E., Theune, M. (eds.) Empirical Methods in NLG. LNCS (LNAI), vol. 5790, pp. 1–12. Springer, Heidelberg (2010)
5. : Information fusion in the context of multi-document summarization. In: Proceedings of the 37th Annual Meeting of the Association for Computational Linguistics (ACL 1999), Morristown, NJ, USA, pp.
We tested the performance of the approaches on the general task of string regeneration, a purely language-modelling task. We found a significant improvement in performance, as measured by BLEU, when using a spanning tree algorithm instead of an n-gram based generator to generate sentences. We conclude that global search algorithms, like the AB spanning tree algorithm, improve the grammaticality of the candidate sentence compared to search algorithms that optimise local constraints. We also found that the Assignment-Based algorithm offered performance improvements over a standard spanning tree algorithm.