NAACL 2021 Paper Review: Semantic Parsing
This is a draft page with notes on recent text-to-SQL methods accepted at NAACL 2021.
The papers are listed in no particular order.
[1] DuoRAT: Towards Simpler Text-to-SQL Models
[2] Structure-Grounded Pretraining for Text-to-SQL
[3] ShadowGNN: Graph Projection Neural Network for Text-to-SQL Parser
[4] Meta-Learning for Domain Generalization in Semantic Parsing
[5] Learning from Executions for Semantic Parsing
[6] Learning to Synthesize Data for Semantic Parsing
Notably, many of the SOTA semantic parsing papers involve one author: Bailin Wang, who is an author of RAT-SQL as well as [4], [5], and [6].
[4]
Intuition: gradient steps that improve source-domain performance should also improve target-domain performance.
- DG-MAML (Domain Generalization with Model-Agnostic Meta-Learning)
- a training algorithm that helps a parser achieve better domain generalization
- the training domains and the test (eval) domains are different
- Meta-Train
- take an SGD step on the loss from a virtual source domain
- Meta-Test
- compute the loss on a virtual target domain using the updated parameters
- minimize the joint loss on both the source and target domains -> requires the gradient step to also be beneficial to the target domain
- can be viewed as a regularization of gradient updates, in addition to the objective of conventional supervised learning
- Applying DG-MAML on RAT-SQL
- evaluated on two zero-shot text-to-SQL benchmarks: English and Chinese Spider
- achieves near-SOTA on Spider, and SOTA on Chinese Spider
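The meta-train / meta-test loop above can be sketched on a toy least-squares model. This is a hypothetical first-order approximation (the paper differentiates through the inner step; the domains, model, and hyperparameters here are made up for illustration): take a virtual SGD step on the source-domain loss, evaluate the target-domain loss at the updated parameters, and apply the summed gradients as the outer update.

```python
import numpy as np

def dg_maml_step(w, src, tgt, alpha=0.1, beta=0.01):
    """One DG-MAML-style update on a toy least-squares model (illustrative sketch).

    src/tgt are (X, y) pairs from a virtual source and a virtual target domain.
    Meta-train: take an inner SGD step on the source-domain loss.
    Meta-test: evaluate the target-domain gradient at the updated parameters.
    Outer step: minimize the joint loss (first-order approximation of DG-MAML).
    """
    Xs, ys = src
    Xt, yt = tgt

    def grad(w, X, y):
        # Gradient of the mean squared error 0.5 * ||Xw - y||^2 / n
        return X.T @ (X @ w - y) / len(y)

    g_src = grad(w, Xs, ys)            # meta-train gradient
    w_inner = w - alpha * g_src        # virtual SGD step on the source domain
    g_tgt = grad(w_inner, Xt, yt)      # meta-test gradient at updated params
    return w - beta * (g_src + g_tgt)  # joint outer update

# Toy usage: both domains share the true weights but differ in input scale,
# standing in for the source/target domain split.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
Xs = rng.normal(size=(64, 2)); ys = Xs @ w_true
Xt = 3.0 * rng.normal(size=(64, 2)); yt = Xt @ w_true

w = np.zeros(2)
for _ in range(200):
    w = dg_maml_step(w, (Xs, ys), (Xt, yt))
print(np.round(w, 2))  # recovers weights that fit both domains
```

The key difference from plain supervised learning is that the outer gradient includes a term evaluated *after* the source-domain step, so a step only counts as good if it also helps on the held-out target domain.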
Questions to ask: TBU
More notes: TBU