Do I know any better?

When asked to define the field of instructional design and technology early in my course of study, I suggested that one needed to look at the functional role of the ID professional as one who helps to uncover unmet needs, translate gaps into learning initiatives, and guide the development and implementation process.  I would add now that the ID professional is also an evaluation expert and troubleshooter for when the evaluation finds that there were some misses or that the training fell entirely short. I had additionally suggested that IDT is the systematic process of creating learning experiences utilizing various technologies, platforms, and settings.  I still feel that this is true but would expand my definition to include that IDT can be a means of facilitating knowledge transfer through these channels.

Through my reading since then, I’ve learned that the field of instructional design and technology has a much broader reach than I initially figured.  Whether the venue for IDT is within healthcare, K-12 through higher education, corporate and industrial, or military and whether it’s in the US or other countries of the world, processes may change and various factors may influence the outcome, but the systematic approach seems to be a universally accepted tenet of IDT.  Furthermore, practitioners may disagree over the level of guidance in the learning milieu as well, but again, there’s the basic agreement over the field as one that uses a system and seeks to optimally match and utilize technology (or not to use it at all) to increase the ability of learners to actually learn and apply their learning within the non-learning environment.

So in all, my definition of instructional design and technology is that it is a field whose focus is on understanding how people learn and then utilizing a systematic and cyclic process of analysis, design, development, implementation, and evaluation of authentic opportunities to learn (appropriately utilizing — or not — technological tools), improve performance, or facilitate knowledge transfer such that a behavior or knowledge can be genuinely demonstrated in real-world situations.

That may very well be the longest single sentence I’ve ever written.  🙂


Who threw that pebble in my pond? … my read of Merrill

Background

In M. David Merrill’s classic 2002 paper, he proffers an alternative ISD approach to ADDIE as a means to address some of the shortcomings of the traditional model.  Among those shortcomings: it can often be too cumbersome and slow, and it may be dated in terms of modern design needs.  He further notes that the process can be criticized for its adherence to a step-wise sequence that is often not implemented as rigorously as it could be; the instruction produced through such a system may therefore miss the mark.

While a variety of ISD methods may be in use, Merrill argues that there are core “first principles” upon which different models agree.  These first principles encompass the general phases of effective problem-centered instruction: activation, demonstration, application, and integration.  Specifically, Merrill’s First Principles in the article are posed as a series of questions as follows:

  • Is the courseware presented in the context of real-world problems? Are learners shown the problem, engaged at the task as well as the operation level, and involved in a progression of problems?
  • Does the courseware attempt to activate relevant prior knowledge or experience? Are learners directed to recall relevant past experience or provided relevant experience? Are they encouraged to use some organizing structure?
  • Does the courseware demonstrate what is to be learned rather than merely telling information about what is to be learned? Are the demonstrations consistent with the instructional goals? Is learner guidance employed?  Do media enhance learning?
  • Do learners have an opportunity to apply their newly acquired knowledge or skill? Is the application consistent with the instructional goals, and does it involve a varied sequence of problems with feedback?  Are learners provided with gradually diminished coaching?
  • Does the courseware provide techniques that encourage learners to integrate (transfer) the new knowledge or skill into their everyday life? Do learners have an opportunity to publicly demonstrate their new knowledge, reflect on their new knowledge, and create new ways to use their new knowledge?
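Merrill’s questions read naturally as a design checklist.  As a purely illustrative sketch (the names and structure here are my own, not Merrill’s), one could encode the five first principles and audit a courseware design against them in Python:

```python
# Hypothetical encoding of Merrill's five first principles as a checklist.
FIRST_PRINCIPLES = {
    "problem-centered": "Is the courseware presented in the context of real-world problems?",
    "activation": "Does it activate relevant prior knowledge or experience?",
    "demonstration": "Does it demonstrate what is to be learned rather than merely tell?",
    "application": "Do learners apply their new knowledge or skill with feedback?",
    "integration": "Are learners encouraged to integrate the new skill into everyday life?",
}

def audit(answers):
    """Return the principles a design fails to satisfy.

    answers: dict mapping principle name -> bool (True = satisfied).
    Principles missing from `answers` are treated as unsatisfied.
    """
    return [name for name in FIRST_PRINCIPLES if not answers.get(name, False)]

# Example: a design that demonstrates and activates but never asks learners
# to apply or integrate would fail the last two principles.
gaps = audit({"problem-centered": True, "activation": True,
              "demonstration": True, "application": False})
```

The point of the sketch is simply that the principles are checkable, design by design, rather than tied to any single ISD model.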

Pebble-in-the-Pond versus ADDIE

In the Pebble-in-the-Pond (PITP) approach, the content is central to the process as it addresses or is based on these “first principles.”  Shown in Figure 1 below, the approach’s starting point is the center: the pebble tossed into the pond, the real-world problem or task to be learned.  The next step is to identify a progression of tasks or problems of increasing difficulty such that if learners master each, they will have mastered the problem or the skill/knowledge to be taught.  The third involves describing the component knowledge or skill in the progression.  In the fourth ripple, the instructional strategy for teaching each of the components is described.  The strategy relies on a diminishing amount of learner guidance as mastery at each step in the instruction is displayed.  In the fifth ripple, the interface is designed, with content adapted to the delivery system.  The final stage is production (a term Merrill states he prefers to the ADDIE term “development”), in which the content is specified and produced.
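To make the ripple sequence concrete, here is a minimal sketch (my own illustration, not from Merrill’s paper; the Excel example below is hypothetical) that models the PITP ripples as fields filled in order, radiating out from the whole task:

```python
from dataclasses import dataclass, field

@dataclass
class PebbleDesign:
    """Illustrative container for the Pebble-in-the-Pond ripples."""
    whole_task: str                                    # the "pebble": the real-world problem
    progression: list = field(default_factory=list)    # tasks of increasing difficulty
    components: dict = field(default_factory=dict)     # task -> component knowledge/skills
    strategies: dict = field(default_factory=dict)     # component -> instructional strategy
    interface: str = ""                                # content adapted to the delivery system
    produced: bool = False                             # final production step complete

# Fill the ripples outward from the pebble.
design = PebbleDesign(whole_task="Build a monthly budget in Excel")
design.progression = ["enter data", "write formulas", "build charts"]
design.components["write formulas"] = ["cell references", "SUM", "IF"]
design.strategies["SUM"] = "demonstrate, then apply with diminishing coaching"
design.interface = "self-paced eLearning module"
design.produced = True
```

The ordering is the essential idea: each field only makes sense once the ripple before it has been worked out.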

The key difference between PITP and ADDIE lies in the content-centeredness and the treatment of the problem as a whole.  However, as Merrill states, the PITP approach “assumes that the designer has already identified an instructional goal (not detailed objectives) and described the learner characteristics.”  Generally speaking, PITP combines the analysis and design steps, asking designers to consider the tasks that need to be accomplished and how those tasks can be broken up into “part-tasks” that can then each be tackled in turn, much like scaffolding.

There are three general ways in which ADDIE and PITP differ.  First, in ADDIE, the instructional objectives are identified early in the process, and Merrill notes that these are “abstract representations of the knowledge to be taught rather than the knowledge itself.”  Second, by their nature, these abstract objectives are sometimes changed or abandoned once successive steps in the design process are completed.  Third, Merrill argues that the PITP model is a more efficient model than traditional ADDIE ISD approaches in that it develops content first.

Study Example

In his study, Merrill used the PITP method to design a Microsoft Excel course, with increasingly difficult problems to solve and decreasing amounts of instruction.  As proof of the model’s success, he compared the performances of 128 students across the same three problem scenarios using three different instructional methods: his own, a commercial eLearning course that taught the commands and operations in a guided demonstration, and a control condition in which there was no prior instruction before tackling the problems (although participants did have access to an FAQ).  The group using the PITP-designed instruction outperformed the guided-demonstration group, and, not surprisingly, both groups outperformed the control.

Reflection

In reflecting on Merrill’s paper in general, I get the sense that PITP is not necessarily that different from ADDIE.  While it does focus on content first, I feel that the performance of the learners could be used to feed back into the design and strategy steps such that content and strategy could be continuously improved.  Merrill did not state that this was necessarily a terminal point in the process, but that is the sense I got from his effort to prove the method superior.  He did suggest that one needs to really know what they’re doing with the tools and methods available to them, so the failings of an ADDIE (or even his PITP) process could lie in its implementation rather than in its design or appropriateness to the course being created.  I also feel that, with its focus on the problem as a series of tasks, it is not unlike the task analysis steps involved in ADDIE models.

Additionally, I’m not sure that the proof he decided to use was a fair comparison.  A fairer comparison would have involved equal design teams using both methods to design a training program and then testing participants on the same problems.  My guess is that both methods would have produced viable courses with similar success levels.  Failing to match the empirical method to the hypothesis too often results in a false proof.  I’m perfectly willing to admit that I read the paper wrong, though, since Dr. Merrill is a revered figure in the field and his study was published in a peer-reviewed journal.


References

Merrill, M.D. (2002). A pebble-in-the-pond model for instructional design. Performance Improvement, 41(7), 41-46.

MOOCs – did we expect too much too soon?

In Pranab Chakraborty’s blog post on TD.org, he questions whether we’re just at the beginning of the MOOC revolution, whether it’ll change the way people learn, or if there’s already a waning in the attraction of MOOCs.  He further questions whether we’ve really just expected too much too soon.

In fact, the New York Times dubbed 2012 “the Year of the MOOC” and quoted edX president Anant Agarwal’s prediction that “a year from now, campuses will give credit for people with edX certificates.”  This grandiose statement reminded me of the infamous predictions of the 1920s and ’30s, like Edison’s, that motion pictures and instructional radio would displace books in the classroom.

The initial focus of Mr. Chakraborty’s post is the use of MOOCs in higher education and their inability to disrupt higher ed’s business model.  He cites the failure of MOOCs to establish a job placement network akin to those in higher ed institutions, which results in a lack of immediate hiring opportunities for students who attain MOOC certifications.  While providing knowledge and skills, MOOCs don’t grant the all-important college degree that makes for a more attractive job applicant, and they additionally fail to offer the intangibles of college life that attract students to brick-and-mortar institutions.

However, he further states that while MOOCs may not displace those brick-and-mortar institutions, they do hold an attraction for lifelong learners and those in search of greater depth and breadth of instruction on a number of topics.  In addition, there is a movement toward the use and creation of MOOCs for nanodegrees: programs of instruction sponsored by companies (such as Google, Salesforce, etc.) in which students create and demo apps and other work products for the sponsoring companies, benefiting student and sponsor alike.

While MOOCs are not displacing or disrupting higher ed’s business model, some colleges have been exploring the use of MOOCs for “try it before you buy it” purposes.  For example, a model that has recently taken hold is the use of MOOCs to draw students into online MBA programs for which they may not previously have qualified; students can try out courses without committing and apply later if they feel confident enough to do so.

In all, while MOOCs may not be taking over the world of higher education, they are finding their niche.  Corporate learners are turning increasingly to MOOCs to gain new skills and knowledge without the price tag of traditional brick-and-mortar schools.  Without doing additional research, I feel that MOOCs can be valuable in developing countries for the ready dissemination of learning material, and that they may have a place in regions of the US, and even in job roles, where the bias toward a college degree is weaker.

Design in Instructional Design

I’ve been putting in several very long weeks at work recently.  My job encompasses many responsibilities around the coordination of assessment center and L&D initiatives.  But of late, it seems the major demand on my time is the creative work of designing engaging materials for training and development.  Specifically, since making a TED-style presentation for my boss, my presentation design skills have increasingly been in demand by consultants across our company.  Being in one of those “fast-paced” corporate environments, finding the time for creative design is difficult.  And it got me thinking about the creative design aspects of instructional design.

For those instructional designers fortunate enough to be surrounded by a team of graphic designers who can tackle the creative production issues, this may never be a concern.  However, in smaller companies like mine, one needs to wear many hats, including that of the “look and feel” architect, and creativity takes time!

I did a quick web search on “putting the design in instructional design” and was not surprised to see that other people have written on this.  In particular, Kineo VP of Learning Cammy Bean, a frequent conference presenter (featured at this winter’s ATD TechKnowledge conference) and author of “The Accidental Instructional Designer,” notes that there are three components to design.  Purpose means that every element of the design contributes to achieving a goal.  Intention is the thoughtful consideration and placement of every element and all of the details; it’s why one selects the font, where buttons are located, the color scheme, etc.  Rather than going about the design in a haphazard manner, it’s making each decision with a specific reason in mind.  For training, it’s thinking about how each piece engages the learner.  Lastly, content provides the framework, or the lens through which the design is focused: what you’re communicating and why you’re including it.

I’m somewhat relieved to have read these tidbits.  I can definitely use it in my defense when my boss is on my case as deadlines loom!  Clearly I’ve been working with purpose on intentional content, and all of that takes a lot of time!

 

Rapid Prototyping Instructional Design: Revisiting the ISD Model – Article Review

Abstract

A mixed-methods investigative study by the authors, Jenny Daugherty, Ya-Ting Teng, and Edward Cornachione, was presented as a paper at the 2007 International Research Conference in The Americas of the Academy of Human Resource Development.  The study examined the perceived quality and client experience while utilizing a compressed process of instructional design, referred to as “rapid prototyping.”  Findings support the notion that rapid prototyping is a viable approach to creating high-quality instruction and can also enhance the level of client satisfaction and buy-in in the process.

Background

Working in a corporate environment, I have been involved in a number of training and development initiatives that involve the creation of courses of study to address a variety of soft skills in the business environment.  As a vendor to our client companies, my firm is frequently tasked with creating high-quality programs in a relatively short period of time.  In my brief study of ID&T thus far, we have focused on the systematic instructional design process proposed by Dick and Carey (1978).  The reality of the corporate world sometimes does not allow as deep and as linear a process as the study model.  Curiosity about a more compressed approach led me to the reviewed article.  Though there may be more recent articles worthy of exploration, this first foray into the topic only further kindles my curiosity about empirical studies of such compressed approaches to instructional design.

Criticisms of Traditional ISD Models and Rapid Prototyping Defined

The authors cite a number of sources of criticism of traditional ISD models, suggesting that their application in corporate and school settings can be difficult.  The models and their processes can be viewed as inflexible, too linear, too slow to implement, and overly analytical.

A rapid prototyping (RP) approach can be seen as a way to address some of these criticisms.  The intention of RP is to provide a nimble process that reduces time and cost issues while increasing client involvement.  RP utilizes an iterative, overlapping ADDIE process.  The authors cite Piskurich (2000), who described RP as a “continuing process, with new aspects being added and evaluated in this mode each week until you finally have a complete program.”  With RP, one can produce a working model or an actual structure of the design in prototype fashion, which can be evaluated by the client (and learners and designers) and revised as necessary before much time and expense has been put into a process that may not ultimately be acceptable to the client.  (From my own experience, this is an approach more akin to how instructional programming has been accomplished in my firm.)  Figure 1 details an example of an RP approach as envisioned by Tripp and Bichelmeyer (1990).
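As a rough illustration of the iterative, overlapping loop the authors describe (my own sketch, not from the paper; the function names and toy client are hypothetical), each cycle builds a prototype, gathers client and learner feedback, and revises until the prototype is accepted:

```python
def rapid_prototype(build, evaluate, max_cycles=5):
    """Iterate build -> evaluate -> revise until the client accepts the prototype.

    build(feedback)  -> prototype (feedback is None on the first pass)
    evaluate(proto)  -> (accepted: bool, feedback for the next build)
    """
    feedback = None
    prototype = None
    for cycle in range(1, max_cycles + 1):
        prototype = build(feedback)               # design/develop overlap in one step
        accepted, feedback = evaluate(prototype)  # client + learner review of the working model
        if accepted:
            return prototype, cycle
    return prototype, max_cycles                  # ship the best prototype if time runs out

# Toy run: the "client" accepts once three rounds of revision are incorporated.
proto, cycles = rapid_prototype(
    build=lambda fb: (fb or []) + ["revision"],
    evaluate=lambda p: (len(p) >= 3, p),
)
```

The contrast with a strictly linear ADDIE pass is that evaluation feeds every cycle, not just the last one.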

Figure 1. A Model of Rapid Prototyping Applied to Instructional Design.  Taken from Tripp, S., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research & Development, 38(1), 31-44.

Study Method

The authors used a study group comprising a client, who was also to be the instructor of the material, and a class of forty undergraduate students taking part in a leadership development program.  The client also served as subject matter expert, working with a design team of three instructional designers, a supervisor, and two staff members.  The rapid prototyping (RP) process resulted in the development and refinement of a working model of the instructional product.  A test session was conducted with the client facilitator delivering the material.  Data were collected through multiple sources (interviews, observations, surveys, and audiovisual recording) and included ratings of the quality of instruction, the design process, and follow-up ratings from the facilitator.

The aims of the study were to answer three questions:

  1. To what extent does RP impact the quality of the training product?
  2. To what extent does RP impact the role of the client?
  3. To what extent does RP impact the usability and customization of the training product?

Summary of Findings

The authors utilized field observation of the process and instruction to gauge real interactions and challenges.  Observations centered on the delivery and content of the instruction, which led to refinements in the training product from the initial prototype.  Timely changes to materials and methods were incorporated into the final product, with input from the facilitator/client.

The authors utilized a feedback survey based on Kirkpatrick’s first level of training evaluation (participant reactions) and noted that the only low scores were in the area of the facility’s appropriateness.  Generally high scores were achieved for content, methods of instruction, and materials.  Participants did additionally note some confusion over the main concepts of the training and a lack of interaction with others, which the designers addressed by allotting learners more time for small group discussion.  The facilitator/client additionally modified the order of the training to address some of these issues.  The authors note that because the client was heavily involved in the instructional design and content of the session, the client was well-equipped to respond quickly to learners’ needs.

The observations and feedback garnered throughout the test session and design process led the team to make further refinements in content and delivery format to better address learners’ needs and criticisms.  Alternate training formats, reordered topics, content revisions, and pre-work were a few of the modifications that the team and the client agreed upon in producing the finalized delivery product.

Study Conclusions

The authors posit that “informed decisions on alternative ways of combining resources to reach similar or even a better level of outcomes is likely to be one of the top priorities in the future for Human Resource Development.”  By utilizing a process of RP, a working model of the final product was deployed early in the design process, which allowed for revisions concurrent with the other tasks of the traditional design process (setting objectives, evaluating the design, and continually improving the design and instructional methods).  The study’s findings support the notion that RP can provide a viable practice to meet tight timelines and still produce high-quality instructional outputs.

Responding to the study’s questions, the authors noted the following:

  1. To what extent does RP impact the quality of the training product?
    1. Based on the perspectives of the learners, designers, supervisors, and the client facilitator, the quality of the final training product was deemed to be very high.
  2. To what extent does RP impact the role of the client?
    1. The client expressed an overall high level of satisfaction with the process, from design to evaluation. Due to the high level of involvement throughout the process, the client expressed a high level of ownership and satisfaction with the final product.
  3. To what extent does RP impact the usability and customization of the training product?
    1. The authors cite Tripp and Bichelmeyer (1990) as having stated that the RP process allows content and method to be adapted to any learner’s needs. This high level of adaptability is one of the main benefits of the RP process.  The study showed that the RP process employed allowed the designers and client facilitator to readily adapt the instruction to a variety of needs.

Final Thoughts

Based on the design of the study, the authors seem to agree that RP can be a timely and flexible way to develop instructional solutions to training needs.  However, a study’s utility is only as good as the data it is based on.  Thus, while utilizing Kirkpatrick’s first level of evaluation as the basis for the outcome measures is likely acceptable for the purposes of this study, it would be interesting to further evaluate the levels of learning to address issues of context and depth, providing a more robust measure of the quality of the final product.  Also, because the client was so involved in both the process and its evaluation, I believe the client was predisposed to respond in the affirmative direction when asked about outcomes.  That is, someone so invested in a process or a product may not be likely to provide low ratings on these aspects.  Lastly, I would be interested to read additional studies that validate these findings, providing further evidence of the value of an RP process in instructional design.

References

Daugherty, J., Teng, Y., & Cornachione, E. (2007). Rapid prototyping instructional design: Revisiting the ISD model. Paper presented at the International Research Conference in The Americas of the Academy of Human Resource Development, Indianapolis, IN, February 28-March 4, 2007.

Dick, W., Carey, L., & Carey, J.O. (2015). The Systematic Design of Instruction (8th ed.). Upper Saddle River, NJ: Pearson.

Tripp, S., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research & Development, 38(1), 31-44.

IDT through my eyes

To define what IDT is to me is to focus on the functional role of the instructional designer in a corporate setting.  That is, the ID professional helps to uncover unmet needs, translate gaps into learning initiatives, and guide the development and implementation processes of these initiatives. IDT is the systematic process of creating learning experiences utilizing various technologies, platforms, and settings.  It is typically collaborative, involving SMEs, decision makers, and business leaders.

It’s an exciting time to develop the skills and knowledge of an IDT professional and to join the field. Change is in the air. We’re experiencing an incredible phenomenon in workplaces across the country as the Boomer generation is retiring, Gen Xers are reaching the apex of their careers, and Millennials are trying to carve out their niche.  The pace of technological change is reaching warp speeds. These forces combine to create a need for companies to adapt their talent development practices in order to fill gaps in knowledge and skill and to meet their employees’ desires to continually learn and grow. The evolution of the workforce brings with it a need for companies to explore, evolve, and invest in learning and development initiatives.

My future IDT job is an evolution of my current job.  It’s my wish to work within the talent management company I’ve been with for more than 15 years to help design and create best-in-class learning opportunities for our client companies: multi-modal (classroom-style instructor-led, webinar, and self-paced eLearning/LMS) L&D solutions that address the gaps and help translate business requirements into measurable skill and developmental improvements within the workforce.

Links to blogs and sites about IDT that I find helpful and interesting:

http://elearningindustry.com/

https://www.td.org/Publications/Blogs/Learning-Technologies-Blog