Broadly defined, academic language (AL) is a set of lexical-grammatical norms and registers commonly used in educational and academic discourse. Mastery of academic language in writing is an important aspect of writing instruction and assessment. The purpose of this study was to use Natural Language Processing (NLP) tools to examine the extent to which features related to academic language explained variance in human-assigned scores of writing quality in a large corpus of source-based argumentative essays (n = 20,820) written by 10th grade students. Using NLP tools, we identified and calculated lexical, syntactic, cohesion, and rhetorical features of academic language from the essays. Consistent with prior research findings, results from a hierarchical linear regression revealed that AL features explained an additional 8% of the variance in writing quality after controlling for essay length. The most important AL features included cohesion with the source text, academic wording, and global cohesion. Implications for integrating NLP-produced measures of AL in writing assessment and automated writing evaluation (AWE) systems are discussed.
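
To illustrate the incremental-variance logic described above, the following is a minimal Python sketch of a two-step hierarchical regression: a baseline model with essay length only, then a second model adding AL features, with the change in R² taken as the variance uniquely attributable to AL. The column names (score, n_words, academic_wording, source_cohesion, global_cohesion) are hypothetical placeholders, not the study's actual variable names or code.

```python
# Sketch of the hierarchical (incremental) regression described in the abstract.
# Assumes a pandas DataFrame with one row per essay; column names are hypothetical.
import pandas as pd
import statsmodels.api as sm


def delta_r_squared(df: pd.DataFrame) -> float:
    """Return the additional variance in human-assigned score explained by
    academic-language (AL) features after controlling for essay length."""
    y = df["score"]

    # Step 1: control variable only (essay length in words).
    X1 = sm.add_constant(df[["n_words"]])
    step1 = sm.OLS(y, X1).fit()

    # Step 2: essay length plus NLP-derived AL features.
    al_features = ["academic_wording", "source_cohesion", "global_cohesion"]
    X2 = sm.add_constant(df[["n_words"] + al_features])
    step2 = sm.OLS(y, X2).fit()

    # Difference in R-squared = variance uniquely explained by AL features
    # (the quantity reported as ~8% in the study).
    return step2.rsquared - step1.rsquared
```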