Conditional Question Generation Model Based on Diffusion Model

DOI: 10.23977/acss.2024.080705

Author(s)

Yujie Wei 1, Kui Jin 1

Affiliation(s)

1 School of Information, Yunnan Normal University, Kunming, China

Corresponding Author

Yujie Wei

ABSTRACT

The ultimate goal of conditional question generation is to produce high-quality, diverse questions. Existing classifier-based diffusion models for conditional question generation rely on an external classifier over the source data to steer generation toward high-quality questions. However, this approach suffers from a complex joint training process and an over-dependence on labeled data, which can degrade both the quality and the diversity of the generated questions. To tackle this problem, we propose a novel classifier-free diffusion model for conditional question generation. First, discrete text data are mapped into continuous vectors, the model's input, by an embedding function. Second, we design a classifier-free training method that embeds the condition directly into the data-fitting process, so that the vectors are denoised under the given condition. Finally, a rounding function maps the sampled vectors back to discrete question text. Experiments show that our approach achieves a competitive average score and better question diversity than other state-of-the-art methods.
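As a rough illustration of the three steps the abstract describes (embedding, classifier-free conditional training, and rounding), the sketch below shows one possible PyTorch realization. It is a minimal sketch under stated assumptions, not the authors' published implementation; the names denoiser, cosine_alpha_bar, and p_uncond are illustrative.

```python
import math
import torch
import torch.nn.functional as F

def cosine_alpha_bar(t, T=1000, s=0.008):
    """Cumulative noise schedule (cosine schedule; an assumed choice here)."""
    f = lambda u: torch.cos((u / T + s) / (1 + s) * math.pi / 2) ** 2
    return f(t.float()) / f(torch.zeros_like(t, dtype=torch.float))

def train_step(denoiser, embedding, token_ids, cond, T=1000, p_uncond=0.1):
    """One classifier-free training step on a batch of token sequences.

    denoiser:  any network predicting x0 from (x_t, t, cond), e.g. a Transformer
    embedding: nn.Embedding mapping discrete tokens to continuous vectors
    cond:      a continuous encoding of the condition, shape (batch, cond_dim)
    """
    x0 = embedding(token_ids)                               # step 1: discrete -> continuous
    t = torch.randint(0, T, (x0.size(0),), device=x0.device)
    a_bar = cosine_alpha_bar(t, T)[:, None, None]
    noise = torch.randn_like(x0)
    xt = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise   # forward diffusion

    # Step 2: classifier-free conditioning -- randomly drop the condition so a
    # single network learns both the conditional and unconditional denoisers.
    drop = torch.rand(cond.size(0), device=cond.device) < p_uncond
    cond = torch.where(drop[:, None], torch.zeros_like(cond), cond)

    pred_x0 = denoiser(xt, t, cond)
    return F.mse_loss(pred_x0, x0)

def round_to_tokens(pred_x0, embedding):
    """Step 3: rounding -- snap each denoised vector to its nearest token embedding."""
    dists = torch.cdist(pred_x0, embedding.weight.unsqueeze(0))  # (batch, len, vocab)
    return dists.argmin(dim=-1)
```

At sampling time, classifier-free guidance would then blend the two learned denoisers, e.g. (1 + w) * denoiser(xt, t, cond) - w * denoiser(xt, t, torch.zeros_like(cond)) for a guidance weight w, which is how this family of models trades off fidelity to the condition against diversity.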

KEYWORDS

Deep Learning, Question Generation, Diffusion Model

CITE THIS PAPER

Yujie Wei, Kui Jin, Conditional Question Generation Model Based on Diffusion Model. Advances in Computer, Signals and Systems (2024) Vol. 8: 37-42. DOI: http://dx.doi.org/10.23977/acss.2024.080705.

