American Journal of Educational Research
ISSN (Print): 2327-6126 | ISSN (Online): 2327-6150. Editor-in-Chief: Ratko Pavlović
American Journal of Educational Research. 2019, 7(8), 583-590
DOI: 10.12691/education-7-8-7

Creating a 21st Century Skills Survey Instrument for High School Students

Todd R. Kelley1, J. Geoff Knowles2, Jung Han1 and Euisuk Sung3

1Technology Leadership and Innovation, Purdue University, West Lafayette, IN, USA

2Ivy Tech Community College, Lafayette, IN, USA

3Science Education, Indiana University, Bloomington, IN, USA

Pub. Date: August 22, 2019

Cite this paper:
Todd R. Kelley, J. Geoff Knowles, Jung Han and Euisuk Sung. Creating a 21st Century Skills Survey Instrument for High School Students. American Journal of Educational Research. 2019; 7(8):583-590. doi: 10.12691/education-7-8-7


This article describes the development of a 21st century skills instrument for high school students. The first version of the instrument was crafted from four rubrics created to assess communication, collaboration, critical thinking, and creativity within project-based learning (PBL) activities. An exploratory factor analysis of the pilot study results revealed multiple survey items loading across multiple factors, requiring a revised instrument. The research team revised the instrument and added items drawn from the language of the P21 standards documents. The revised 50-item 21st century skills instrument was administered to 276 high school students participating in a STEM program. The final exploratory factor analysis yielded 30 survey items loading across the four subscales, with strong internal consistency within the constructs. This instrument can be used as a baseline and achievement measure of high school students' 21st century skills.
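The factor-retention and internal-consistency checks the abstract describes can be sketched numerically. The following Python snippet is illustrative only: the function names and the simulated Likert data are assumptions for demonstration, not the authors' actual dataset or analysis code.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a set of survey items.

    items: (n_respondents, n_items) matrix of item scores.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

def kaiser_retained_factors(items):
    """Number of factors with eigenvalue > 1 (Kaiser criterion),
    computed from the inter-item correlation matrix -- a common
    first step when deciding how many factors to extract in EFA."""
    corr = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    eigenvalues = np.linalg.eigvalsh(corr)
    return int((eigenvalues > 1.0).sum())

# Illustrative use on simulated 5-point Likert responses
# (hypothetical data, not the study's 276-student sample):
rng = np.random.default_rng(0)
simulated = rng.integers(1, 6, size=(276, 30)).astype(float)
print(f"alpha = {cronbach_alpha(simulated):.2f}")
print(f"factors retained = {kaiser_retained_factors(simulated)}")
```

In practice, a full EFA would also apply a rotation (e.g., varimax) and inspect the pattern of item loadings before dropping cross-loading items, as the study's revision process describes.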

Keywords: 21st century skills, instrument development, exploratory factor analysis, survey

This work is licensed under a Creative Commons Attribution 4.0 International License.


[1]  Partnership for 21st Century Skills [P21]. (2009). P21 framework definitions. Retrieved July 10, 2019 from:
[2]  Organization for Economic Cooperation and Development [OECD]. (2005). The definition and selection of key competencies: Executive summary. Paris, France: OECD.
[3]  American Association of School Librarians (2018) AASL Standards for the 21st Century Learner.
[4]  National Governors Association. (2010). Common core state standards. Washington, DC.
[5]  Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3(1), 11.
[6]  Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, England: Cambridge University Press.
[7]  National Academy of Engineering and National Research Council [NAE & NRC]. (2009). Engineering in K-12 education: Understanding the status and improving the prospects. Washington, DC: National Academies Press.
[8]  National Research Council [NRC]. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
[9]  Boss, S. (2013). PBL for 21st century success: Teaching critical thinking, collaboration, communication, and creativity. Novato, CA: Buck Institute for Education.
[10]  Buck Institute for Education. (2019). 9-12 Presentation Rubric. Retrieved from
[11]  Paulhus, D. L., & Vazire, S. (2007). The self-report method. Handbook of research methods in personality psychology, 1, 224-239.
[12]  Pett, M. A., Lackey, N. R., & Sullivan, J. J. (2003). Making sense of factor analysis: The use of factor analysis for instrument development in health care research. Sage.
[13]  Thompson, B. (2007). Exploratory and confirmatory factor analysis: Understanding concepts and applications. Applied Psychological Measurement, 31(3), 245-248.
[14]  Cerny, B. A., & Kaiser, H. F. (1977). A study of a measure of sampling adequacy for factor-analytic correlation matrices. Multivariate Behavioral Research, 12(1), 43-47.
[15]  Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1-9.
[16]  Litwin, M. (1995). How to measure survey reliability and validity. Thousand Oaks, CA: Sage Publications.
[17]  Browne, M. N., & Keeley, S. M. (1998). Asking the right questions: A guide to critical thinking. (5th Ed.). Upper Saddle River, NJ: Prentice Hall.
[18]  Burns, N., & Grove, S. K. (1993). The practice of nursing research: Conduct, critique and utilization.
[19]  Mason, E. J. & Bramble, W. J. (1997). Research in education and behavioral sciences. Chicago: Brown & Benchmark Publishers.
[20]  DeVellis, R. F. (2003). Scale development: Theory and application. London: Sage Publishing.
[21]  Levesque-Bristol, C., & Cornelius-White, J. (2012). The public affairs scale: Measuring the public good mission of higher education. Journal of Public Affairs Education, 18(4), 695-716.
[22]  Brown, J.S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
[23]  Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: John Wiley and Sons.
[24]  Stronge, J. H., Grant, L. W., & Xu, X. (2017). Designing effective assessments. Bloomington, IN: Solution Tree Press.
[25]  Chu, S. K. W., Reynolds, R. B., Tavares, N. J., Notari, M., & Lee, C. W. Y. (2017). 21st century skills development through inquiry-based learning: From theory to practice. Singapore: Springer Nature.
[26]  Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide for novices. Australasian Journal of Paramedicine, 8(3).