The Pedagogical Effectiveness Index

Dr. Nish Sonwalkar (ScD, MIT) was the first to introduce the pedagogical effectiveness index.

A New Methodology for Evaluation: The Pedagogical Rating of Online Courses

Article originally published in the January 2002 issue of Syllabus Magazine.

In articles appearing in our November and December issues, Nishikant Sonwalkar examined the elements of online learning within the structure of a learning cube. Here, he proposes an instrument for evaluating online courses based on those elements.

Online course offerings are increasing in number every day. Most universities and corporate training facilities now offer some or all of their courses online. In fact, more than 1,000 corporate universities and online providers offer courses in everything from information technology to Chinese cooking. Although it is clearly advantageous for asynchronous learners to access educational information and content anywhere and anytime, it is difficult to evaluate the quality and effectiveness of online courses and learning modules.

Growing Need for Evaluation

Open source learning platforms and public access to online course content are increasingly popular because higher education can benefit from joint development efforts and shared resources, which ultimately reduce the cost of online learning. Consortia are sharing volumes of information and courseware, and several vendors are providing their technology as open source materials.

In the open source, open content environment we are entering, it is important to develop a common, objective scale and summative instrument with which to measure the pedagogical effectiveness of online course offerings.

Models of Evaluation

In my two previous articles in Syllabus (November and December 2001), I described the pedagogical learning cube (see Figure 1) in the context of instructional design. In this article, I'll again invoke the cube, including the five functional learning styles: apprenticeship, incidental, inductive, deductive, and discovery (x-axis); the six media elements: text, graphics, audio, video, animation, and simulation (y-axis); and the third axis of the cube (the z-axis), which represents the interactive aspects of learning.

Figure 1: The learning cube
Learning styles: L1 = apprenticeship; L2 = incidental; L3 = inductive; L4 = deductive; L5 = discovery

The z-axis indicates the degree to which students are engaged with the learning content, moving from a teacher-centric to a student-centered approach. This interactivity axis (z-direction) of the cube may be defined in terms of five elements: system feedback, adaptive remediation and revision, e-mail exchange, discussion groups, and bulletin boards. With this definition of the learning cube, a framework can be constructed to define pedagogy as a 3D space.

Pedagogical effectiveness is at the heart of online offerings and defines critical parameters for the evaluation of courses. However, learning management systems provide the essential integrative layer for online courses. If online courses are delivered in the context of learning management systems, several additional factors must be considered in any evaluation.

I propose a new instrument for overall evaluation, based on a five-factor summative rating system plus a pedagogy effectiveness index (PEI). The intent of the methodology described here is to create objective criteria for evaluating the quality of online courses based on the existing elements that represent pedagogical content.

The Pedagogy Effectiveness Index

Expanding on the above arguments, the pedagogical effectiveness of an online course can be defined as a summation of learning styles, media elements, and interactivity.

Assuming that each of those factors is equally likely and mutually exclusive, a probability distribution tree diagram (see Figure 2) can be shown to have three branches, with sub-branches represented for each axis of the pedagogical learning cube. A PEI can therefore be determined by a summative rule (see Figure 3). The corresponding probability multipliers can be shown in a simple matrix (see Figure 4).

Figure 2: The probability tree diagram for the pedagogical learning cube

Figure 3: The pedagogy effectiveness index expressed as a summative rule

Figure 4: Simple probability distribution matrix
Style             Pi      Media       Pj      Interaction   Pk
Apprenticeship    0.068   Text        0.055   Feedback      0.066
Incidental        0.068   Graphics    0.055   Revision      0.066
Inductive         0.068   Audio       0.055   E-mail        0.066
Deductive         0.068   Video       0.055   Discussion    0.066
Discovery         0.068   Animation   0.055   Bulletin      0.066
                          Simulation  0.055
Total (weighted)  0.34                0.33                  0.33
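As a sketch of where the Figure 4 weights come from: each axis of the cube carries roughly one third of the total probability mass, split evenly among its sub-branches (the styles axis rounds to a 0.34 total so that a fully populated course sums to 1.0):

```python
# Each axis of the learning cube carries ~1/3 of the probability mass,
# divided evenly among its sub-branches (values as in Figure 4).
style_w = round(0.34 / 5, 3)     # 5 learning styles      -> 0.068
media_w = round(0.33 / 6, 3)     # 6 media elements       -> 0.055
interact_w = round(0.33 / 5, 3)  # 5 interactive elements -> 0.066
print(style_w, media_w, interact_w)
```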

Consider the following cases as examples of the application of the PEI.

Case 1: The PEI for a course with one learning style, one media element, and one interactive element will be:

PEI = 0.068 + 0.055 + 0.066 = 0.189

Case 2: The PEI for a course with three learning styles, four media elements, and two interactive elements will be:

PEI = 3*0.068 + 4*0.055 + 2*0.066 = 0.556

Case 3: The PEI for a course with five learning styles, six media elements, and five interactive elements will be:

PEI = 5*0.068 + 6*0.055 + 5*0.066 = 1.0

These cases clearly illustrate that the PEI varies from 0 to 1. The probability of the pedagogical effectiveness increases as cognitive opportunity increases with the inclusion of learning styles, media elements, and interaction. The PEI is based on a simple probability distribution and should be considered an approximate indicator within the bounds of assumptions listed above, specifically relating to the flexible learning approach depicted by the pedagogical learning cube.
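The three cases can be reproduced with a short function (a minimal sketch; the function name is illustrative, and the per-element weights are taken from Figure 4):

```python
# Probability weights from Figure 4 (per element present in the course)
STYLE_W = 0.068     # each of the 5 learning styles
MEDIA_W = 0.055     # each of the 6 media elements
INTERACT_W = 0.066  # each of the 5 interactive elements

def pei(n_styles: int, n_media: int, n_interactions: int) -> float:
    """Pedagogy effectiveness index: summative rule over the cube's axes."""
    assert 0 <= n_styles <= 5 and 0 <= n_media <= 6 and 0 <= n_interactions <= 5
    return n_styles * STYLE_W + n_media * MEDIA_W + n_interactions * INTERACT_W

print(round(pei(1, 1, 1), 3))  # Case 1 -> 0.189
print(round(pei(3, 4, 2), 3))  # Case 2 -> 0.556
print(round(pei(5, 6, 5), 3))  # Case 3 -> 1.0
```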

Summative Rating for Online Courses

The PEI serves as an indicator of the pedagogical richness of a course. However, online course delivery systems include several additional factors that affect the measure of success. Objective criteria for a summative evaluation should be applied in five major areas, including (1) content factors, (2) learning factors, (3) delivery support factors, (4) usability factors, and (5) technological factors.

These factors are evaluated with reference to the learning technology standards proposed by IMS, AICC, and SCORM.

Content factors. The content is the basis for course delivery and must be good to begin with. Mediocre content cannot be improved simply by infusing it with pedagogical styles or multimedia tools. It is important that an independent authority authenticates the accuracy and quality of the content. The source and author of the content must be given proper attribution to avoid copyright and compensation issues and to hold the author responsible for the content's quality.

Learning factors. The effectiveness of an online course depends on the quality of pedagogically driven instructional design. The learning factors at the core of the educational quality of an online course include concept identification, pedagogical styles, media enhancements, interactivity with the educational content, testing and feedback, and collaboration. Often, the objectives of a course are not well-defined. It is important that the instructional design is sensitive to the functional learning style that accommodates individual content sequencing and aggregation preferences.

Delivery support factors. The success of an online course depends heavily on the delivery support function essential for course instructors, administrators, and users. A software module should manage user authentication, portfolio information, and records of users' activities throughout the course, as well as course content elements, including video streaming servers, audio servers, and the HTML server. Also, the federal government now requires colleges and universities to make access to online course content available to students with vision and hearing impairments.

Usability factors. Even if the quality of the content, pedagogical styles, and multimedia tools is high, an online course can be a complete failure if usability is poor. Users interact with online Web courses through a graphical user interface, so the design of graphic elements, the color scheme, the type fonts, and navigational elements can all affect how a course is organized and perceived by students.

Web pages that are loaded with information require excessive scrolling within a window and can be detrimental to the educational quality of the presentation. Design experts recommend presenting small chunks of information in 800×600-pixel windows. Page layout and ease of access from other parts of the course site are crucial to the success of an online course.

Technological factors. The issues that influence the technological success of online courses include available bandwidth, target system configuration, server capacity, browser software, and database connectivity. The network bandwidth defines the lowest common denominator for the course Web page. Designing for 56 kilobits/sec modem access has more limitations than designing for a T1 network connection of about 1.5 megabits/sec. The number of simultaneous users a Web server can handle is also an important constraint for the large-scale deployment of online courses.

Designing courses to run via Microsoft Corp.'s Internet Explorer vs. Netscape Communications Corp.'s Navigator can make a difference in the kind of HTML 4.0 vs. JavaScript features that can be included. The choice also has an impact on the plug-ins that may be required to run interactive applications. Most large-scale online courses are powered by a database back end. The database connectivity and connection pooling mechanism can become a bottleneck if not dealt with properly.

Most rating systems are summative and depend on a precise definition of the quantitative scale. The most widely used rating system is the Likert scale, which I have selected for the proposed summative evaluation instrument (see Figure 5).

Figure 5: Summative evaluation instrument for rating online courses
No.  Evaluation Factors            Absent  Poor  Average  Good  Excellent
1    Content Factors                  0      1      2       3       4
       Quality
       Authenticity
       Validity
       Media
       Presentation
       Attribution
2    Learning Factors                 0      1      2       3       4
       Concept Identification
       Pedagogical Styles
       Media Enhancements
       Interactivity
       Testing and Feedback
       Collaboration
3    Delivery Support Factors         0      1      2       3       4
       User Management
       Course Content
       Accessibility
       Reporting
4    Usability Factors                0      1      2       3       4
       Graphical User Interface
       Interactive Design
       Clarity
       Chunk Size
       Page Layout
5    Technological Factors            0      1      2       3       4
       Network Bandwidth
       Target System Configuration
       Server Capacity
       Browser Software
       Database Connectivity

An Overall Rating

The summative evaluation results (the sum of the ratings of all the factors in each of the five categories) and the PEI can be combined to give a final result that provides a view of the overall effectiveness of the online course:

Overall Rating = PEI x Summative Rating Score

The advantage of using the overall rating formula lies in the ability to incorporate the scores of both the pedagogical and delivery systems to provide a final rating that will be useful for comparing online course offerings.
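As a sketch of the combination (all course scores here are hypothetical): a course with a PEI of 0.556 whose Figure 5 instrument totals 80 of a possible 104 points (26 sub-factors, each rated 0 to 4) would rate as follows:

```python
def overall_rating(pei: float, summative_score: float) -> float:
    """Overall rating = PEI x summative rating score."""
    return pei * summative_score

# Hypothetical course: PEI of 0.556, summative score of 80 out of 104
# (26 sub-factors in Figure 5, each rated on the 0-4 Likert scale).
print(round(overall_rating(0.556, 80), 2))  # -> 44.48
```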

The pedagogy effectiveness index and the summative evaluation instrument used in combination can be powerful tools for evaluating large numbers of online offerings. These criteria have a clear emphasis on pedagogically driven design. Widespread use of these tools could guide and motivate online education developers, universities, and training centers toward the creation of educational systems marked by measurable success.

Nishikant Sonwalkar is the principal educational architect at the Educational Media Creation Center at the Massachusetts Institute of Technology, and serves as the pedagogical advisor to Web-based educational experiments and projects.


MIT Enterprise Forum: Start-up Spotlight on IntellADAPT’s Brain-wave Adaptive Learning

On Wednesday, June 14, 2017, intellADAPT, a Boston-based educational technology and Adaptive Learning startup, was invited by the MIT Enterprise Forum to showcase the first Brain-Computer Interface (BCI) for STEM education at the MIT Start-up Spotlight 2017 at Hatch Fenway in Cambridge, MA.

IntellADAPT gave live demonstrations of its Brainwave Adaptive Learning technology at the showcase. The demonstration used an EEG headband to record brain activity while a student prepares for high-stakes examinations. Attendees were impressed by intellADAPT's ability to determine the best way a person can learn within an hour. intellADAPT is excited about this ground-breaking technology and will be gearing up for its commercial implementation in the coming months.

IntellADAPT has developed brain-training equipment that helps students prepare for high-stakes examinations by identifying the best learning strategy and controlling exam anxiety through real-time neurofeedback during practice examinations. Stay tuned for the upcoming product launch event.

The intellADAPT team would like to give special thanks to Amy Goggins and Katja Wald for hosting this important event. For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact intellADAPT staff member Swamini Shah at (617) 287-5791.

Dr. Sonwalkar presenting at NSF Phase II grantees conference, June 5 to 7, Atlanta, GA.

Dr. Sonwalkar at intellADAPT booth

On Monday, June 5, 2017, intellADAPT, a Boston-based educational technology and Adaptive Learning startup, attended the National Science Foundation's 2017 Phase II SBIR|STTR Grantee Conference in Atlanta, GA. At the event, intellADAPT showcased its Phase II-funded Big Data analytics engine for brainwave adaptive technology.


Attendees were impressed by intellADAPT's ability to analyze large amounts of data to help students learn more efficiently. intellADAPT uses an EEG headband to assess attention states and translate that information into differentiated learning strategies. intellADAPT is excited about this new technology and will be gearing up for its commercial implementation in the coming months.

The intellADAPT team would like to give a special thank you to the National Science Foundation for serving as host for this important event.

For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact the intellADAPT staff Nick Simmons at (617) 287-5791.

IntellADAPT attends Consumer Electronics Show in Las Vegas, NV

From January 4th to January 8th, intellADAPT, a Boston-based educational technology and Adaptive Learning startup, attended the Consumer Electronics Show in Las Vegas, NV. At the event, intellADAPT showcased its brainwave adaptive technology in a live demonstration alongside several other SBIR awardees.

The demonstration consisted of live tracking of expo attendees' brainwaves while they read brochures and played a game developed by intellADAPT called Adaptive Memory Builder, available on the App Store now. The demonstration showed that attendees exhibited higher attention rates while playing the game, which requires a great deal of focus.

Attendees were impressed by intellADAPT's ability to assess attention states using the EEG headband and translate that information into differentiated learning strategies. intellADAPT is excited about this new technology and will be gearing up for its implementation in the coming months.

The intellADAPT team generated a great deal of interest in the brainwave adaptive technology aspect of intellADAPT's Adaptive Resource for the Classroom, or ARC. intellADAPT looks forward to pursuing possible partnerships through connections made during this event.

Dr. Nish Sonwalkar, ScD MIT, founder and CEO of intellADAPT, commented, “Being able to attend this event has been an honor, and we are very appreciative of NSF for funding our participation in this event.”

The intellADAPT team would like to give a special thank you to the staff of CES and to NSF for funding the booth, as well as the Sands Convention Center for serving as host for this important event.

For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact the intellADAPT staff at (617) 287-5791.

intellADAPT Attends Department of Education’s EDgames Expo in Washington D.C.

On Wednesday, December 14, 2016, intellADAPT, a Boston-based educational technology and Adaptive Learning startup, attended the Department of Education's EdGames Expo at the Entertainment Software Association office in Washington, DC. At the event, intellADAPT showcased its brainwave adaptive technology with live demonstrations.

The intellADAPT booth at edGames Expo
Photography by Cynthia Cephas

The demonstration consisted of live tracking of expo attendees' brainwaves while they played a game developed by intellADAPT called “Adaptive Memory Builder,” available on iTunes and Google Play. The demonstration showed that the alpha, beta, and gamma brain waves are good indicators of attention states. In general, attendees with higher attention (high alpha) scored better playing the Adaptive Memory Builder game. This was the first brain-computer interface demonstration for evaluating, in real time, attention states created by virtual games.

Attendees were impressed by intellADAPT's ability to assess attention states using the EEG headband and translate that information into differentiated learning strategies. intellADAPT is excited about this new technology and will be gearing up for its commercial implementation in the coming months. IntellADAPT is seeking strategic partnerships with game developers to integrate brain-wave adaptive learning and improve educational games.

Dr. Nishikant Sonwalkar wearing EEG headband
Photography by Cynthia Cephas

The intellADAPT team would like to give a special thank you to Edward Metz and the staff of Entertainment Software Association for serving as host for this important event.

For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact the intellADAPT staff Nick Simmons at (617) 287-5791.

intellADAPT Attends New England Venture Summit in Dedham, MA

On the 6th and 7th of December, 2016, intellADAPT, a Boston-based educational technology and Adaptive Learning startup, attended the New England Venture Summit in Dedham, MA. At the event, intellADAPT presented its brainwave adaptive technology to various Venture Capital firms.

The presentation, delivered by CEO Dr. Nish Sonwalkar, addressed how education can be transformed by brainwave adaptive technology, focusing on how intellADAPT's patented brain wave assessment engine can dramatically boost learning outcomes at the K-12 and higher ed levels.

Dr. Nishikant Sonwalkar wearing brain wave headband
Photography by Cynthia Cephas

The intellADAPT team generated a great deal of interest in the brainwave adaptive technology aspect of intellADAPT’s Adaptive Resource for the Classroom or ARC.

Dr. Nishikant Sonwalkar, founder and President of intellADAPT, engaged with numerous Venture Capital companies present at the summit and was encouraged by many of their responses. He said, “We are integrating the first Brain-Computer Interface in our adaptive learning platform, which will give unprecedented opportunity to improve learning outcomes in schools and colleges for STEM education.” IntellADAPT saw an overwhelming response from visitors and fellow participants who were convinced that they could use intellADAPT's technology in their STEM programs to enhance learning outcomes.

Attendees were impressed by intellADAPT's ability to assess attention states using the EEG headband and translate that information into differentiated learning strategies. intellADAPT is excited about this new technology and will be gearing up for its implementation in the coming months.

The intellADAPT team would like to give a special thank you to Joe Benjamin and his staff, as well as the Hilton in Dedham, MA, for serving as host for this important event.

For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact intellADAPT staff member Nick Simmons at (617) 287-5791.

intellADAPT Sponsors Massachusetts STEM Summit to Improve STEM Education Using Adaptive Solutions

On November 1, 2016, intellADAPT, a Boston-based educational technology and Adaptive Learning company, was one of the sponsors of the Massachusetts STEM Summit at the DCU Center in Worcester, MA. The intellADAPT booth at the MA STEM Summit presented adaptive learning solutions to increase STEM participation among high school and college students.

IntellADAPT, a company funded by National Science Foundation (NSF) SBIR grants, proudly showcased its patented Adaptive 2.0 Learning Technology, which uses Big Data analytics and a Brain Computer Interface (EEG headband) to record users' brain activity and determine their optimal learning strategy. At the intellADAPT booth, visitors received valuable information on how the ground-breaking adaptive learning technology works and how schools and universities can participate in a National Study on the use of Adaptive Learning for STEM Education, funded by the NSF.

Dr. Nish Sonwalkar, President IntellADAPT at Massachusetts STEM Summit 2016, DCU Center, Worcester.

The intellADAPT team generated a great deal of interest in the homework substitute aspect of intellADAPT’s Adaptive Resource for the Classroom or ARC. The first Physics ARC is deployed in numerous school systems participating in the national study conducted by intellADAPT.

Dr. Nishikant Sonwalkar, founder and President of intellADAPT, engaged with numerous potential partner institutions and companies present at the summit and encouraged them to explore implementing intellADAPT's adaptive STEM courses by participating in pilot programs offered in Fall 2016 and Spring 2017. He said, “There is an acute need for developing STEM competency in schools and colleges to fill the growing unmet need for STEM jobs. We need to prepare the next generation of engineers and scientists now to meet the challenge of future demand.” IntellADAPT saw an overwhelming response from visitors and fellow participants who were convinced that they could use intellADAPT's technology in their STEM programs to enhance learning outcomes.

The intellADAPT team would like to give a special thank you to Dana Bienkowski and Lynn Griesemer for organizing this conference and allowing intellADAPT to demonstrate adaptive learning solutions for schools and colleges.

For more information about intellADAPT and partnership opportunities, please visit our website at www.intellADAPT.com or contact the intellADAPT staff at (617) 287-5791.

IntellADAPT team presenting adaptive learning solutions at the Exhibit Hall of Massachusetts STEM Summit 2016.