Semantic parsing is the task of converting a natural language utterance to a logical form: a machine-understandable representation of its meaning.[1] Semantic parsing can thus be understood as extracting the precise meaning of an utterance. Applications of semantic parsing include machine translation,[2] question answering,[1][3] ontology induction,[4] automated reasoning,[5] and code generation.[6][7] The phrase was first used in the 1970s by Yorick Wilks as the basis for machine translation programs working with only semantic representations.[8] Semantic parsing is one of the core tasks in computational linguistics and natural language processing.

[Figure: Semantic parsing system architecture]

Semantic parsing maps text to formal meaning representations. This contrasts with semantic role labeling and other forms of shallow semantic processing, which do not aim to produce complete formal meanings.[9] In computer vision, semantic parsing is a process of segmentation for 3D objects.[10][11]

[Figure: Major levels of linguistic structure]

History and Background


Early research in semantic parsing included manually written grammars[12] as well as approaches based on inductive logic programming.[13] In the 2000s, most of the work in this area involved the creation or learning of grammars and lexicons for controlled tasks,[14][15] particularly general grammars such as SCFGs.[16] These systems improved upon manual grammars primarily because they leveraged the syntactic structure of the sentence, but they still could not cover enough variation and were not robust enough to be used in the real world. However, following the development of advanced neural network techniques, especially sequence-to-sequence (Seq2Seq) models,[17] and the availability of powerful computational resources, neural semantic parsing started emerging. Not only did neural models provide competitive results on existing datasets, they were also robust to noise and required far less supervision and manual intervention.

The transition from traditional to neural semantic parsing has not been without shortcomings, however. Neural semantic parsing, even with its advantages, still fails to solve the problem at a deeper level. Neural models like Seq2Seq treat the parsing problem as sequential translation and learn patterns in a black-box manner, which makes it difficult to verify whether the model is truly capturing the underlying semantics. Modifications of Seq2Seq models that incorporate syntactic and semantic structure have been attempted,[18][19] with marked improvements in results, but a great deal of ambiguity remains to be resolved.

Types


Shallow Semantic Parsing


Shallow semantic parsing is concerned with identifying entities in an utterance and labelling them with the roles they play. Shallow semantic parsing is sometimes known as slot-filling or frame semantic parsing, since its theoretical basis comes from frame semantics, wherein a word evokes a frame of related concepts and roles. Slot-filling systems are widely used in virtual assistants in conjunction with intent classifiers, which can be seen as mechanisms for identifying the frame evoked by an utterance.[20][21] Popular architectures for slot-filling are largely variants of an encoder-decoder model, wherein two recurrent neural networks (RNNs) are trained jointly to encode an utterance into a vector and to decode that vector into a sequence of slot labels.[22] This type of model is used in the Amazon Alexa spoken language understanding system.[20] These models are typically trained in a supervised fashion on utterances annotated with slot labels.
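
The following is a minimal sketch, assuming PyTorch, of the joint intent-classification and slot-filling setup described above; the vocabulary, label sets, layer sizes, and class names are invented for illustration, and the untrained model is run once only to make the input and output shapes concrete:

```python
# A toy joint intent + slot-filling model: a GRU encodes the utterance,
# an intent head classifies the whole utterance from the final hidden state,
# and a slot head predicts one BIO-style slot label per token.
import torch
import torch.nn as nn

INTENTS = ["list_flights", "book_flight"]
SLOTS = ["O", "B-source", "B-destination"]
VOCAB = {"<unk>": 0, "show": 1, "me": 2, "flights": 3,
         "from": 4, "boston": 5, "to": 6, "dallas": 7}

class JointSlotFiller(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.intent_head = nn.Linear(hidden_dim, len(INTENTS))
        self.slot_head = nn.Linear(hidden_dim, len(SLOTS))

    def forward(self, token_ids):
        states, final = self.encoder(self.embed(token_ids))
        intent_logits = self.intent_head(final[-1])  # one intent per utterance
        slot_logits = self.slot_head(states)         # one slot label per token
        return intent_logits, slot_logits

tokens = "show me flights from boston to dallas".split()
ids = torch.tensor([[VOCAB.get(w, 0) for w in tokens]])
model = JointSlotFiller(len(VOCAB))                  # untrained: outputs are random
intent_logits, slot_logits = model(ids)
print(INTENTS[intent_logits.argmax(-1).item()])
print(list(zip(tokens, (SLOTS[i] for i in slot_logits.argmax(-1)[0].tolist()))))
```

After supervised training on annotated utterances, the slot head would label "boston" as B-source and "dallas" as B-destination while the intent head predicts list_flights.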

Deep Semantic Parsing


Deep semantic parsing, also known as compositional semantic parsing, is concerned with producing precise meaning representations of utterances that can contain significant compositionality.[23] Shallow semantic parsers can parse utterances like "show me flights from Boston to Dallas" by classifying the intent as "list flights", and filling slots "source" and "destination" with "Boston" and "Dallas", respectively. However, shallow semantic parsing cannot parse arbitrary compositional utterances, like "show me flights from Boston to anywhere that has flights to Juneau". Deep semantic parsing attempts to parse such utterances, typically by converting them to a formal meaning representation language. More recently, compositional semantic parsing has used large language models to solve artificial compositional generalization tasks such as SCAN.[24]
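
To make the notion of compositionality concrete, the following is a hand-written sketch of what a nested meaning representation for the compositional utterance above could look like; the structure and field names are invented for illustration and are not the output of any actual parser:

```python
# A hypothetical nested representation of "show me flights from Boston to
# anywhere that has flights to Juneau". The destination slot holds an entire
# sub-query, which a flat intent/slot scheme cannot express.
query = {
    "list": "flight",
    "source": "Boston",
    "destination": {                       # the destination is itself a query
        "list": "city",
        "such_that": {
            "list": "flight",
            "source": "<that city>",       # bound to the enclosing city
            "destination": "Juneau",
        },
    },
}
```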

Neural Semantic Parsing


Semantic parsers play a crucial role in natural language understanding systems because they transform natural language utterances into machine-executable logical forms or programs. A well-established field of study, semantic parsing finds use in voice assistants, question answering, instruction following, and code generation. The emergence of neural approaches has led many of the presumptions that underpinned semantic parsing to be rethought, producing a substantial change in the models employed. Though semantic neural networks and neural semantic parsing[25] both deal with natural language processing (NLP) and semantics, they are not the same. The models and executable formalisms used in semantic parsing research have traditionally been strongly dependent on concepts from formal semantics in linguistics, such as the λ-calculus produced by a CCG parser. Recent work with neural encoder-decoder semantic parsers, however, has made possible more approachable formalisms, such as conventional programming languages, and NMT-style models that are considerably more accessible to a wider NLP audience.
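
As a concrete illustration of the sequence-transduction framing used by NMT-style neural semantic parsers, both the utterance and a linearized logical form can be treated as plain token sequences; the GeoQuery-flavoured target below is a hand-written approximation, not the verbatim form from any dataset:

```python
# Neural semantic parsing as sequence-to-sequence transduction: an encoder
# reads the source tokens and a decoder emits logical-form tokens one at a
# time, exactly as an NMT decoder emits target-language words.
source = "what states border texas".split()
target = ["answer", "(", "A", ",", "(", "state", "(", "A", ")", ",",
          "next_to", "(", "A", ",", "B", ")", ",", "const", "(", "B", ",",
          "stateid", "(", "texas", ")", ")", ")", ")"]
```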

Representation languages


Early semantic parsers used highly domain-specific meaning representation languages,[26] with later systems using more extensible languages like Prolog,[13] lambda calculus,[16] lambda dependency-based compositional semantics (λ-DCS),[27] SQL,[28][29] Python,[30] Java,[31] the Alexa Meaning Representation Language,[20] and the Abstract Meaning Representation (AMR). Some work has used more exotic meaning representations, like query graphs,[32] semantic graphs,[33] or vector representations.[34]
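
To illustrate how one question can be rendered in several of these representation languages, the following shows hand-written approximations (not verbatim entries from any dataset) side by side:

```python
# The same question expressed in three meaning representation languages.
# All three renderings are illustrative approximations.
question = "what is the capital of texas"

prolog_style = "answer(C, (capital(S, C), const(S, stateid(texas))))"  # GeoQuery-like
lambda_form  = "λx. capital(texas, x)"                                 # lambda calculus
sql          = "SELECT capital FROM state WHERE state_name = 'texas';"
```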

Models


Most modern deep semantic parsing models are based either on defining a formal grammar for a chart parser or on using RNNs to translate directly from natural language to a meaning representation language. Examples of systems built on formal grammars are the Cornell Semantic Parsing Framework,[35] Stanford University's Semantic Parsing with Execution (SEMPRE),[3] and the Word Alignment-based Semantic Parser (WASP).[36]

Datasets


Datasets used for training statistical semantic parsing models are divided into two main classes based on application: those used for question answering via knowledge base queries, and those used for code generation.

Question answering

 
[Figure: Semantic parsing for conversational question answering. An example conversation from SPICE: the left column shows dialogue turns (T1–T3) with user (U) and system (S) utterances; the middle column shows the annotations provided in CSQA; the blue boxes on the right show the sequence of actions (AS) and the corresponding SPARQL semantic parses (SP).]

A standard dataset for question answering via semantic parsing is the Air Travel Information System (ATIS) dataset, which contains questions and commands about upcoming flights as well as the corresponding SQL.[28] Another benchmark dataset is the GeoQuery dataset, which contains questions about the geography of the U.S. paired with corresponding Prolog queries.[13] The Overnight dataset is used to test how well semantic parsers adapt across multiple domains; it contains natural language queries about eight different domains paired with corresponding λ-DCS expressions.[37] Semantic parsing has recently been gaining popularity as a result of new research and the involvement of many large companies, such as Google, Microsoft, and Amazon. One recent work on semantic parsing for conversational question answering over knowledge graphs, SPICE, is illustrated in the figure above.[38]

Code generation


Popular datasets for code generation include two trading card datasets that link the text that appears on cards to code that precisely represents those cards. One was constructed by linking Magic: The Gathering card texts to Java snippets, the other by linking Hearthstone card texts to Python snippets.[31] The IFTTT dataset[39] uses a specialized domain-specific language with short conditional commands. The Django dataset[40] pairs Python snippets with English and Japanese pseudocode describing them. The RoboCup dataset[41] pairs English rules with their representations in a domain-specific language that can be understood by virtual soccer-playing robots.

Application Areas


Within the field of natural language processing (NLP), semantic parsing deals with transforming human language into a format that machines can understand and process. This is useful in a number of contexts:

  • Voice Assistants and Chatbots: Semantic parsing enhances the quality of user interaction in devices such as smart speakers and chatbots for customer service by comprehending and answering user inquiries in natural language.
  • Information Retrieval: It improves the comprehension and processing of user queries by search engines and databases, resulting in more precise and pertinent search results.
  • Machine Translation: Comprehending the semantics of the source language allows it to be translated accurately into another language, improving translation quality and contextual fidelity.
  • Text Analytics: Business intelligence and social media monitoring benefit from the meaningful insights that can be extracted from text data through semantic parsing. Examples of these insights include sentiment analysis, topic modelling, and trend analysis.
  • Question Answering Systems: Found in systems such as IBM Watson, these systems assist in comprehending and analyzing natural language queries in order to deliver precise responses. They are particularly helpful in areas such as customer service and educational resources.
  • Command and Control Systems: Semantic parsing aids in the accurate interpretation of voice or text commands used to control systems in applications such as software interfaces or smart homes.
  • Content Categorization: It is a useful tool for online publishing and digital content management as it aids in the classification and organization of vast amounts of textual material by analyzing its semantic content.
  • Accessibility Technologies: Semantic parsing helps create tools for people with disabilities, such as sign language interpretation and text-to-speech conversion.
  • Legal and Healthcare Informatics: Semantic parsing can extract and structure important information from legal documents and medical records to support research and decision-making.

Semantic parsing aims to improve various applications' efficiency and efficacy by bridging the gap between human language and machine processing in each of these domains.

Evaluation


Evaluation metrics vary according to the type of meaning representation parsed into and the underlying goals of parsing:

For meaning representations that can be represented as graphs, such as Abstract Meaning Representation, graph similarity between system output and reference graph allows evaluation even of partial successes in parsing.[42]

Evaluation using F1-measure along with precision and recall is also used, e.g. for some graph-based or logical form meaning representations.[42][43]
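
The following is a minimal sketch of the triple-overlap idea behind graph-based scores such as Smatch: each meaning graph is reduced to a set of (node, relation, node) triples and the overlap is scored with precision, recall, and F1. Full Smatch additionally searches over variable alignments, which this toy version omits, and the triples here are invented for illustration:

```python
# Score a predicted meaning graph against a gold graph by triple overlap.
gold = {("want-01", "ARG0", "boy"), ("want-01", "ARG1", "go-02"),
        ("go-02", "ARG0", "boy")}
pred = {("want-01", "ARG0", "boy"), ("want-01", "ARG1", "go-02")}

matched = len(gold & pred)
precision = matched / len(pred)
recall = matched / len(gold)
f1 = 2 * precision * recall / (precision + recall)
print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f}")  # P=1.00 R=0.67 F1=0.80
```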

If semantic parsing is framed as a sequence-to-sequence (seq2seq) task, more traditional metrics used in natural language processing for comparing sequences, such as BLEU, can be utilized.[44][45]: 7

Aside from metrics rewarding partial successes, a stricter metric is exact match between system output and the reference. Accuracy, the percentage of samples whose meaning representation is predicted correctly, is reported for instance for some text-to-SQL parsing tasks.[46]: 6f Accuracy is also used when testing the compositional generalization abilities of semantic parsers.[47][48][49]

For executable semantic parsing, not only can the predicted meaning representation be evaluated, but also the result of executing the prediction. As a first step, however, the predicted representation needs to be syntactically well-formed to allow execution. For all well-formed system outputs, the execution result can be compared to the result of executing the gold standard representation.[44]: 5, 15
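
The following is a minimal sketch contrasting exact match with execution-based evaluation for text-to-SQL parsing, using Python's built-in sqlite3 module; the schema, data, and queries are invented for illustration:

```python
# Exact match compares query strings; execution match compares query results.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (source TEXT, destination TEXT)")
conn.executemany("INSERT INTO flights VALUES (?, ?)",
                 [("Boston", "Dallas"), ("Boston", "Juneau")])

gold = "SELECT destination FROM flights WHERE source = 'Boston'"
pred = "SELECT destination FROM flights WHERE source = 'Boston' ORDER BY destination"

exact_match = gold.strip() == pred.strip()           # False: the strings differ

def run(sql):
    try:
        return sorted(conn.execute(sql).fetchall())  # order-insensitive result set
    except sqlite3.Error:
        return None                                  # ill-formed output cannot score

execution_match = run(pred) is not None and run(pred) == run(gold)
print(exact_match, execution_match)                  # False True
```

Here the prediction fails exact match because of the extra ORDER BY clause, yet passes execution match because both queries return the same rows; a syntactically ill-formed prediction would fail both.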


References

  1. ^ a b Jia, Robin; Liang, Percy (2016). "Data Recombination for Neural Semantic Parsing". arXiv:1606.03622 [cs.CL].
  2. ^ Andreas, Jacob, Andreas Vlachos, and Stephen Clark. "Semantic parsing as machine translation." Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Vol. 2. 2013.
  3. ^ a b Berant, Jonathan, et al. "Semantic Parsing on Freebase from Question-Answer Pairs." EMNLP. Vol. 2. No. 5. 2013.
  4. ^ Poon, Hoifung, and Pedro Domingos. "Unsupervised ontology induction from text." Proceedings of the 48th annual meeting of the Association for Computational Linguistics. Association for Computational Linguistics, 2010.
  5. ^ Kaliszyk, Cezary, Josef Urban, and Jiří Vyskočil. "Automating formalization by statistical and semantic parsing of mathematics." International Conference on Interactive Theorem Proving. Springer, Cham, 2017.
  6. ^ Rabinovich, Maxim; Stern, Mitchell; Klein, Dan (2017). "Abstract Syntax Networks for Code Generation and Semantic Parsing". arXiv:1704.07535 [cs.CL].
  7. ^ Yin, Pengcheng; Neubig, Graham (2017). "A Syntactic Neural Model for General-Purpose Code Generation". arXiv:1704.01696 [cs.CL].
  8. ^ Wilks, Y. and Fass, D. (1992) The Preference Semantics Family, In Computers and Mathematics with Applications, Volume 23, Issues 2-5, Pages 205-221.
  9. ^ Hoifung Poon, Pedro Domingos Unsupervised Semantic Parsing , Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing, 2009
  10. ^ Armeni, Iro, et al. "3d semantic parsing of large-scale indoor spaces." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016.
  11. ^ Qi, Charles R., et al. "Pointnet: Deep learning on point sets for 3d classification and segmentation." Proceedings of the IEEE conference on computer vision and pattern recognition. 2017.
  12. ^ Warren, D. H. D.; Pereira, F. C. N. (1982). "An efficient easily adaptable system for interpreting natural language queries" (PDF). Comput. Linguist. 8 (3–4): 110–122.
  13. ^ a b c Zelle, J. M.; Mooney, R. J. (1996). "Learning to parse database queries using inductive logic programming". Proceedings of the National Conference on Artificial Intelligence: 1050–1055. hdl:1721.1/7095.
  14. ^ Zettlemoyer; Collins (2005). "Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars" (PDF). Proceedings of the Twenty-First Conference on Uncertainty in Artificial Intelligence, UAI'05: 658–666.
  15. ^ Kwiatkowski, T.; et al. (2010). "Inducing probabilistic ccg grammars from logical form with higher-order unification". Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing: 1223–1233.
  16. ^ a b Wong, Yuk Wah; Mooney, Raymond (2007). "Learning synchronous grammars for semantic parsing with lambda calculus". Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics.
  17. ^ Dong, L.; Lapata, M. (2016). "Language to logical form with neural attention". Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Volume 1: Long Papers: 33–43. arXiv:1601.01280.
  18. ^ Yin, Pengcheng; Neubig, Graham (2017). "A Syntactic Neural Model for General-Purpose Code Generation". Proceedings of 55th Annual Meeting of the Association for Computational Linguistics. Volume 1: Long Papers: 440–450. doi:10.18653/v1/P17-1041.
  19. ^ Shi, Tianze; et al. (2018). "IncSQL: Training incremental text-to-SQL parsers with non-deterministic oracles". arXiv Preprint. arXiv:1809.05054.
  20. ^ a b c Kumar, Anjishnu; et al. (2018). "Just ASK: Building an Architecture for Extensible Self-Service Spoken Language Understanding". arXiv preprint. arXiv:1711.00549.
  21. ^ Bapna, Ankur; et al. (2017). "Towards zero-shot frame semantic parsing for domain scaling". arXiv preprint. arXiv:1707.02363.
  22. ^ Liu, Bing; Lane, Ian (2016). "Attention-based recurrent neural network models for joint intent detection and slot filling". arXiv preprint. arXiv:1609.01454.
  23. ^ Liang, Percy; Potts, Christopher (2015). "Bringing machine learning and compositional semantics together". Annual Review of Linguistics. 1 (1): 355–376. doi:10.1146/annurev-linguist-030514-125312.
  24. ^ Drozdov, Andrew; et al. (2022). "Compositional Semantic Parsing with Large Language Models". ArXiv preprint. arXiv:2209.15003.
  25. ^ Matt Gardner, Pradeep Dasigi, Srinivasan Iyer, Alane Suhr, Luke Zettlemoyer. "Neural Semantic Parsing" Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, July 2018.
  26. ^ Woods, William A. (1979). Semantics for a question-answering system. Outstanding dissertations in the computer sciences. Vol. 27. New York: Garland. ISBN 0-8240-4419-3.
  27. ^ Liang, Percy (2013). "Lambda dependency-based compositional semantics". arXiv preprint. arXiv:1309.4408.
  28. ^ a b Hemphill, Charles T.; et al. (1990). "The ATIS spoken language systems pilot corpus". Speech and Natural Language: Proceedings of a Workshop Held at Hidden Valley, Pennsylvania, June 24–27, 1990.
  29. ^ Iyer, Srinivasan; et al. (2017). "Learning a neural semantic parser from user feedback". arXiv preprint. arXiv:1704.08760.
  30. ^ Yin, Pengcheng; Neubig, Graham (2017). "A syntactic neural model for general-purpose code generation". arXiv preprint. arXiv:1704.01696.
  31. ^ a b Ling, Wang; et al. (2016). "Latent predictor networks for code generation". arXiv preprint. arXiv:1603.06744.
  32. ^ Yih, Scott Wen-tau; et al. (2015). "Semantic parsing via staged query graph generation: Question answering with knowledge base" (PDF). Proceedings of the Joint Conference of the 53rd Annual Meeting of the ACL and the 7th International Joint Conference on Natural Language Processing of the AFNLP.
  33. ^ Reddy, Siva; et al. (2014). "Large-scale semantic parsing without question-answer pairs". Transactions of the Association of Computational Linguistics. 2 (1): 377–392.
  34. ^ Guu, Kelvin; et al. (2015). "Traversing knowledge graphs in vector space". arXiv preprint. arXiv:1506.01094.
  35. ^ Artzi, Yoav (2013). "Cornell SPF: Cornell semantic parsing framework". arXiv preprint. arXiv:1311.3011.
  36. ^ Wong, Yuk Wah; Mooney, Raymond J. (2006). Learning for semantic parsing with statistical machine translation. Proceedings of the main conference on Human Language Technology Conference of the North American Chapter of the Association of Computational Linguistics -. Association for Computational Linguistics. pp. 439–446. CiteSeerX 10.1.1.135.7209. doi:10.3115/1220835.1220891.
  37. ^ Wang, Yushi, Jonathan Berant, and Percy Liang. "Building a semantic parser overnight." Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Vol. 1. 2015.
  38. ^ Laura Perez-Beltrachini, Parag Jain, Emilio Monti, Mirella Lapata. Semantic Parsing for Conversational Question Answering over Knowledge Graphs 'Proceedings on EACL 2023'. 28 January 2023.
  39. ^ Quirk, Chris, Raymond Mooney, and Michel Galley. "Language to code: Learning semantic parsers for if-this-then-that recipes." Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Vol. 1. 2015.
  40. ^ Oda, Yusuke, et al. "Learning to generate pseudo-code from source code using statistical machine translation (t)." Automated Software Engineering (ASE), 2015 30th IEEE/ACM International Conference on. IEEE, 2015.
  41. ^ Kuhlmann, Gregory, et al. "Guiding a reinforcement learner with natural language advice: Initial results in RoboCup soccer." The AAAI-2004 workshop on supervisory control of learning and adaptive systems. 2004.
  42. ^ a b Oepen, Stephan; Abend, Omri; Hajic, Jan; Hershcovich, Daniel; Kuhlmann, Marco; O'Gorman, Tim; Xue, Nianwen; Chun, Jayeol; Straka, Milan; Uresova, Zdenka (2019). "MRP 2019: Cross-Framework Meaning Representation Parsing" (PDF). Proceedings of the Shared Task on Cross-Framework Meaning Representation Parsing at the 2019 Conference on Natural Language Learning. Association for Computational Linguistics. pp. 1–27. doi:10.18653/v1/K19-2001. Retrieved 2025-08-06.
  43. ^ Van Noord, Rik; Abzianidze, Lasha; Haagsma, Hessel; Bos, Johan (2018). Calzolari, Nicoletta; Choukri, Khalid; Cieri, Christopher; Declerck, Thierry; Goggi, Sara; Hasida, Koiti; Isahara, Hitoshi; Maegaard, Bente; Mariani, Joseph; Mazo, Hélène; Moreno, Asuncion; Odijk, Jan; Piperidis, Stelios; Tokunaga, Takenobu (eds.). Evaluating Scoped Meaning Representations (PDF). Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018). Miyazaki, Japan: European Language Resources Association (ELRA). pp. 1685–1693. Retrieved 2025-08-06.
  44. ^ a b Kamath, Aishwarya; Das, Rajarshi (2019). "A Survey on Semantic Parsing". Automated Knowledge Base Construction (AKBC). doi:10.24432/C5WC7D. Retrieved 2025-08-06.
  45. ^ Lee, Celine; Gottschlich, Justin; Roth, Dan (2021). "Toward Code Generation: A Survey and Lessons from Semantic Parsing". arXiv:2105.03317 [cs.SE].
  46. ^ Liang, Percy (2016). "Learning executable semantic parsers for natural language understanding". Communications of the ACM. 59 (9): 68–76. arXiv:1603.06677. doi:10.1145/2866568. ISSN 0001-0782.
  47. ^ Lake, Brenden; Baroni, Marco (2018). Dy, Jennifer; Krause, Andreas (eds.). "Generalization without Systematicity: On the Compositional Skills of Sequence-to-Sequence Recurrent Networks" (PDF). Proceedings of the 35th International Conference on Machine Learning. 80. Stockholmsmässan, Stockholm, Sweden: 2873–2882. Retrieved 2025-08-06.
  48. ^ Keysers, Daniel; Schärli, Nathanael; Scales, Nathan; Buisman, Hylke; Furrer, Daniel; Kashubin, Sergii; Momchev, Nikola; Sinopalnikov, Danila; Stafiniak, Lukasz; Tihon, Tibor; Tsarkov, Dmitry; Wang, Xiao; Zee, Marc van; Bousquet, Olivier (2019). Measuring Compositional Generalization: A Comprehensive Method on Realistic Data. International Conference on Learning Representations (ICLR). Retrieved 2025-08-06.
  49. ^ Kim, Najoung; Linzen, Tal (2020). "COGS: A Compositional Generalization Challenge Based on Semantic Interpretation" (PDF). Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics. pp. 9087–9105. doi:10.18653/v1/2020.emnlp-main.731. Retrieved 2025-08-06.