TY - GEN
T1 - Automatic student writing evaluation
T2 - 11th International Conference on Learning Analytics and Knowledge: The Impact we Make: The Contributions of Learning Analytics to Learning, LAK 2021
AU - Öncel, Püren
AU - Flynn, Lauren E.
AU - Sonia, Allison N.
AU - Barker, Kennis E.
AU - Lindsay, Grace C.
AU - McClure, Caleb M.
AU - McNamara, Danielle S.
AU - Allen, Laura K.
N1 - Funding Information:
This research was supported in part by IES Grants R305A180261 and R305A180144, as well as the Office of Naval Research (Grants N00014-17-1-2300 and N00014-19-1-2424). Opinions, conclusions, or recommendations do not necessarily reflect the views of the Department of Education, IES, or the Office of Naval Research.
Publisher Copyright:
© 2021 ACM.
PY - 2021/4/12
Y1 - 2021/4/12
AB - Automated Writing Evaluation systems have been developed to help students improve their writing skills through the automated delivery of both summative and formative feedback. These systems have demonstrated strong potential in a variety of educational contexts; however, they remain limited in their personalization and scope. The purpose of the current study was to begin to address this gap by examining whether individual differences could be modeled in a source-based writing context. Undergraduate students (n=106) wrote essays in response to multiple sources and then completed an assessment of their vocabulary knowledge. Natural language processing tools were used to characterize the linguistic properties of the source-based essays at four levels: descriptive, lexical, syntactic, and cohesion. Finally, machine learning models were used to predict students' vocabulary scores from these linguistic features. The models accounted for approximately 29% of the variance in vocabulary scores, suggesting that the linguistic features of source-based essays are reflective of individual differences in vocabulary knowledge. Overall, this work suggests that automated text analyses can help to understand the role of individual differences in the writing process, which may ultimately help to improve personalization in computer-based learning environments.
KW - Individual differences
KW - Machine-learning models
KW - Source-based writing
KW - Vocabulary knowledge
UR - http://www.scopus.com/inward/record.url?scp=85103917987&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85103917987&partnerID=8YFLogxK
U2 - 10.1145/3448139.3448207
DO - 10.1145/3448139.3448207
M3 - Conference contribution
AN - SCOPUS:85103917987
T3 - ACM International Conference Proceeding Series
SP - 620
EP - 625
BT - LAK 2021 Conference Proceedings - The Impact we Make
PB - Association for Computing Machinery
Y2 - 12 April 2021 through 16 April 2021
ER -