Tuesday, 08 November 2016

Paper, Chapters 3 & 4 (Lecturer: Dr. Dirgantara Wicaksono, M.Pd)


 Establishing Performance Objectives and Performance Measurement and Delivering The Instruction Effectively
Lecturer : Dr. Dirgantara Wicaksono, M.Pd
By:
Damayanti Celara
Nanda Aulia Rachman
Indy Kartika Sari
Serli Nurjanah T.A
Chairunnipah
Deza Rahayu
Anggun Pratiwi
STUDY PROGRAM OF PRIMARY TEACHER EDUCATION
FACULTY OF EDUCATION
UNIVERSITY OF MUHAMMADIYAH JAKARTA
2016
PREFACE
Praise and thanks be to God Almighty, whose blessing and mercy enabled the writers to complete this paper, entitled "Establishing Performance Objectives and Performance Measurement and Delivering the Instruction Effectively", on time.
             The aim of this paper is to examine the steps in the process of designing instruction and in selecting teaching materials.
             On this occasion, the writers wish to thank all those who gave moral and material support so that this paper could be completed.
            Although every effort has been made to complete this paper as well as possible, the writers realize that it still has shortcomings. The writers therefore welcome constructive criticism and suggestions from readers to remedy any shortcomings in the arrangement of this paper.
Jakarta, 29 September 2016
                                    The Writers
INTRODUCTION
A.               Background
The world of education is a place where a person can discover and develop his or her potential through learning experiences. In school, the teacher acts as a facilitator, demonstrator, and motivator. Education is therefore an aspect of the formation of a child's character, and for that reason a teacher must possess the competencies of a professional educator.
All of this takes place in learning, and the teacher's performance in the classroom is thus very important to observe. Related to this, the methods, steps, learning media, and educational management that are needed must be prepared thoroughly so that the aims of education can be reached.
B.                 Problem Formulation
1.      What is meant by establishing performance objectives?
2.      How can the instruction be delivered effectively?
3.      What are the steps in selecting or designing instructional materials?
C.                Purpose
1.       Students are able to understand establishing performance objectives.
2.       Teachers are able to understand the steps in selecting or designing instructional materials.
A.                Establishing Performance Objectives and Performance Measurements
1.                  Distinguishing performance objectives from goals and activities
Performance objectives should not be confused with goals or activities. Instructional goals are simply expressions of the general results desired from instruction. Unlike performance objectives, they are not measurable. In a famous explanation, Mager calls them "warm fuzzies" because they sound desirable (warm) but are so vague (fuzzy) that achieving them is unclear. In fact, different people may assign their own meanings or significance to them. Examples of instructional goals are easy enough to point out and include such lofty efforts as "improving customer service," "improving quality," "increasing profitability," and "increasing learner understanding."
a.            Deriving performance objectives from goal analysis and task or content analysis
Instructional designers can derive performance objectives from goal analysis, carried out with instructional and organizational goals and learner-trainer activities, or from task or content analysis results.
1)                  Defining Goal Analysis
Goal analysis is a means of transforming laudable but otherwise vague desires into specific and measurable targets for learner accomplishment (Mager, 1997a). Goal analysis is appropriate to use on those many occasions when instructional designers are approached by their clients to work miracles. Clients often speak in terms of vague and ill-defined goals, and instructional designers must use methods such as performance analysis to decide what kind of performance problem exists. Goal analysis is a later step, intended to determine precisely what measurable results are desired from an instructional design solution.
2)                  Performing goal analysis
a)      Identify the goal, the warm fuzzy, and write it down.
b)      Write down examples of what people are saying or doing when they are behaving in a way corresponding to the goal.
c)      Sort out unrelated items and polish the list developed.
d)     Describe each item precisely.
e)      Test the performance objectives.
3)                  Converting results of task or content analysis into performance objectives
Instructional designers convert the results of task or content analysis into specific performance objectives by:
a)      Establishing instructional purpose
b)      Classifying learning tasks
c)      Analyzing learning tasks
Instructional designers should bear in mind that the appropriate way to carry out the instructional design process depends on the results to be achieved. Classifying work tasks into learning tasks is important because it can suggest the best ways to design instruction that is intended to bring about particular results. Of course, more than one classification scheme for work or for learning tasks or content has been devised.
4)                  Linking work activities and performance objectives
Performance objectives must always be tied to work activities. However, they may be linked to different expressions of work activities; for instance, to work tasks as they are performed at present or as they could be performed more efficiently and effectively in the future.
b.            Stating objectives in performance terms
Instructional designers should describe the desired results of instruction in performance based terms. They should be able to classify the type of performance objectives that must be written and then state performance objectives that are directly or indirectly linked to work requirements.
1)                  Classifying performance objectives
Instructional designers begin the process of stating performance objectives by identifying the kinds of objectives that must be written. Referring to the task classification prepared earlier in the instructional design process, they should clarify whether each objective will focus on knowledge, skills, or attitudes.
2)                  Describing parts of performance objectives
Performance objectives make tangible a vision of what learners should know, do,or feel at the end of a planned instructional experience.
1.      Performance
2.      Criterion
3.      Condition
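The three parts listed above (performance, criterion, condition) can be sketched as a small data structure. This is an illustrative sketch only; the field names and the sample objective are invented, not taken from the text:

```python
from dataclasses import dataclass

@dataclass
class PerformanceObjective:
    """A performance objective stated in measurable terms.

    The three fields follow the performance/criterion/condition
    structure described above; the names are hypothetical.
    """
    performance: str  # what the learner will be able to do
    criterion: str    # how well, e.g. an accuracy or time standard
    condition: str    # under what circumstances or with what resources

    def statement(self) -> str:
        """Combine the three parts into one objective statement."""
        return f"{self.condition}, the learner will {self.performance} {self.criterion}."

obj = PerformanceObjective(
    performance="assemble the five components of the device",
    criterion="with 100 percent accuracy in under ten minutes",
    condition="Given the standard toolkit and a parts diagram",
)
print(obj.statement())
```

Writing the objective this way makes each of the three parts explicit, so a missing criterion or condition is immediately visible.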
3)                  Avoiding common mistakes in writing performance objectives
Writing performance objectives is more difficult than it may appear at first blush. Some mistakes are relatively common.
a)      Avoid making objectives long-winded.
b)      Do not use vague language.
c)      Try to avoid descriptions of criteria that are linked to instructor (or supervisor) satisfaction, as in the phrase “ will perform to the satisfaction of the instructor.”
d)     Avoid lengthy “laundry lists” of required equipment and other resources when describing the conditions necessary for performance.
c.             Judging performance objectives
Instructional designers should be able to evaluate the performance objectives written by themselves or others.
d.            Justifying performance objectives
Instructional designers should be capable of explaining why they have written performance objectives the way they have.
e.             Acting ethically in writing performance objectives
A key ethical issue in writing performance objectives can be expressed by this question: do the performance objectives written match up to the performance expectations of the job, task, or content that was analyzed? The key ethical issue in writing performance objectives is thus to ensure that the performance objectives of instruction, when realized, will effectively meet job, task, or content requirements.
DEVELOPING PERFORMANCE MEASUREMENTS
Instructional designers should usually develop performance measurements during or immediately following the preparation of performance objectives. Measurements of all kinds, sometimes called metrics, have been commanding some attention in recent years (Brown, 1999; Hatten and Rosenthal, 2001).
A.                Developing Performance Measurements
Instructional designers should be capable of developing tests, written questionnaires, interviews, and other methods of measuring performance. The performance measures should be written clearly and correspond to performance objectives, rely on appropriate methods of measuring learning outcomes, comply with time and instructional constraints, and meet requirements for validity and reliability. Instructional designers should be able to develop performance measurements when they are furnished with necessary information on the characteristics of learners, the settings in which they are expected to perform, constraints on performance and instructional development, instructional objectives, and plans for analyzing needs and evaluating results as applicable.
1.                  Deciding on the Purpose
Instructional designers should always begin by clarifying their purposes for measuring performance. There are at least four possible purposes (Kirkpatrick, 1996):
a.         Participant reaction
b.        Participant learning
c.         On-the-job performance change
d.        Organizational impact
2.                  Determining Sources of Information
After determining the purpose of performance measurement, instructional designers should next determine the sources of information that will be used in measurement. There are three major sources of information. Performance objectives are the first.
Issue:
a)         Content of instruction
b)         Method of instruction
c)         Amount of learning
d)        Instructor skills
e)         Length and place of instruction
f)          Objectives
g)         Omissions
h)         Learning transfer
i)           Accommodation
j)           Relevance
k)         Application of learning
l)           Efficiency
m)       Hindsight
B.                 An Overview of Steps in Preparing an Instrument
There are ten basic steps to be taken during the preparation of a measurement instrument.
1)            Clarifying the purpose of measurement and selecting a type of instrument
2)            Giving the instrument a descriptive title
3)            Conducting background research
4)            Drafting or modifying items
5)            Sequencing-or reviewing the sequence of-items
6)            Trying out the instrument on a small-group representative of the learner population
7)            Revising the instrument based on the small-group tryout
8)            Testing the instrument on a larger group
9)            Using the instrument-but establishing a means of tracking experience with it
10)        Revising the instrument-or specific items-periodically
         These steps are summarized in the following paragraphs.
Step 1 : Clarifying the Purpose of Measurement and Selecting a Type of Instrument
                        Instructional designers should start developing performance measurements by thinking through exactly why they are measuring instruction and, more important, what results they wish to achieve.
Step 2  : Giving the Instrument a Descriptive Title
                        If performance will be measured using an instrument developed by someone else, instructional designers should consider the title to see if it accurately describes what they wish to measure.
Step 3  : Conducting Background Research
            Instructional designers can often save themselves considerable time and effort by locating previously prepared instruments.
Step 4  : Drafting or Modifying Items
            Relying on instructional objectives or other sources as a starting point, instructional designers should next decide what questions they need to ask to measure the changes wrought by the instructional experience. If a previously prepared instrument was located, each item must be reviewed to ensure that it is appropriate.
Step 5  : Sequencing-or Reviewing the Sequence of-Items
One choice is to sequence items in a logical order based on work tasks. Another choice is to sequence items according to a learning hierarchy.
Step 6  : Trying Out the Instrument on a Small-Group Representative of the Learner Population
                        Sometimes called instrument pre-testing, this step should not be confused with learner pre-testing. If possible, instructional designers should select a sample of people representative of the learner population to participate in the instrument pre-test and ask for their help in identifying wording that is unclear or is otherwise inappropriate.
Step 7  : Revising the Instrument Based on the Small-Group Tryout
            If a complete revision is necessary, which should rarely be the case, another small group should be selected for the purpose of a second instrument pre-test. Otherwise, instructional designers should revise items, based on their notes from the previous step, to improve clarity.
Step 8 : Testing the Instrument on a Larger Group
            The next step is a field test of the instrument on a larger group under conditions resembling, as closely as possible, those in which the instrument will later be used. The results of the field test should be noted.
Step 9  : Using the Instrument-But Establishing a Means of Tracking Experience with it
                        Instructional designers should use the instrument but should also establish a way of tracking future experience with it.
Step 10 : Revising the Instrument-or Specific Items-Periodically
                        As performance measurements are made using instruments, instructional designers gain experience. They can take advantage of that experience by periodically revising the instrument, or specific items on it.
Other Methods of Measuring Performance
1)         An advisory committee
2)         An external assessment centre
3)         An attitude survey
4)         A group discussion
5)         An exit interview
6)         A performance appraisal
C.                                       Judging Performance Measurements
Instructional designers should be capable of judging performance measurements they or their colleagues have developed when they are provided with a performance measure and are furnished with necessary information on the characteristics of learners, the settings in which they are expected to perform, constraints on performance and instructional development, instructional objectives, and plans for analyzing needs and evaluating results as applicable.
D.                                         Justifying Performance Measurements
Instructional designers should also be capable of explaining their reasons for developing performance measurements and instruments as they did. As in most instructional design activities, they should consider themselves accountable for what they do. Consequently, they should be prepared to answer questions posed by other stakeholders.
E.                                        Acting Ethically in Developing Performance Measurements
Performance measurements devised by management alone may not enjoy the ownership of workers, and may not even be realistic. Further, workers may be concerned about how the results of performance measurements will be applied to them as managers make future employment decisions.
F.                                         Applying Cross-Cultural Awareness to Developing Performance Measurements
      In many Asian and European cultures, students advance through formal schooling only by demonstrating competence through paper-and-pencil testing. That practice is unlike the educational system in the United States. For that reason, testing in training contexts may be regarded much more seriously in Asian and European cultures than in the United States. Instructional designers should thus be aware that, by measuring learner performance through testing, they may exert on workers tremendous (and perhaps undue) pressure to excel. As a consequence, special care should be taken to clarify why testing is worthwhile and how the results will be used in making employment decisions.
SEQUENCING PERFORMANCE OBJECTIVES
Sequencing instruction should usually occur after work tasks have been analyzed, performance objectives have been written, and performance measurements have been developed. It ensures that workers are introduced systematically to what they must know or do to perform competently. The resulting sequence of objectives becomes the basis of an instructional outline, sometimes called an instructional syllabus. It is a blueprint for choosing an instructional strategy and for selecting, modifying, or preparing instructional materials. In this chapter, we describe approaches to sequencing performance objectives, offer simple advice to instructional designers about judging and justifying sequencing decisions, and mention key ethical and cross-cultural issues in sequencing performance objectives.
APPROACHES TO SEQUENCING
There are at least nine approaches to sequencing performance objectives and the instruction planned to meet those objectives : 
1.                  Chronological sequencing
The content is arranged by time sequence, with the presentation of later events preceded by discussion of earlier ones. Instruction is sequenced from past to present to future. This approach is typically used with history.
2.                  Topical sequencing
When performance objectives are sequenced topically, learners are immediately immersed in the middle of a topical problem or issue. Learners are then led back in time to see how the problem originated and at times forward to see what will happen if the problem is not solved. For instance, a recent newspaper article on water pollution could be the starting point for instruction on agricultural waste management systems.
3.                  Whole-to-part sequencing
Learners are presented with an overarching logic to govern what they should know. In this way, they can see how each part relates to a larger conceptual system. Learners are first presented with a complete model or a description of the full complexities of a physical object, abstraction or a work duty. Instruction is then organized around parts of the whole. Examples are the hardware in a computer system, the instructional design process, or the job of an employee development specialist. Continuing with the examples; in whole-to-part sequencing, instruction would go from computer system to components, from design process to steps, and from job to duties.
4.                  Part-to-whole sequencing
Learners are presented with each part of a larger object, abstraction, or work duty. By the end of instruction, they should be able to conceptualize the entire object or abstraction or be able to perform the entire duty. For the example immediately above, instruction would go from components to the computer system, from steps to design process, and from duties to the job. 
5.                  Known-to-unknown sequencing
Learners are introduced to what they already know and are gradually led into what they do not know. For example, in teaching how to develop Web pages using HTML, the instructor finds out how much the students know about the Windows environment and how experienced they are with the Internet before launching into instruction on HTML.
Ø    Applying Cross-Cultural Awareness to Sequencing Performance Objectives
            What is the culture's preference for synchronicity? The answer to this question is most relevant to applying cross-cultural awareness to sequencing performance objectives. In this context, synchronicity means "occurring at the same time."
Western cultures tend to be synchronous societies (Hofstede, 1991; Odenwald, 1997). Time is viewed as a straight line. It exists outside individuals. Learners prefer to start on time and at the beginning, progress through instruction in a logical sequence, and end on time.
In asynchronous cultures, however, time is viewed as a circle. It exists inside individuals. Scheduled starting and ending times are less important. Consider, as a simple example, preferences about movie schedules, which can provide valuable clues about the culture. In the United States, movies start and end according to a fixed schedule. People usually want to watch a movie completely, so they appear at the movie theater when the show is scheduled to begin. This practice displays a preference for synchronicity.
DELIVERING THE INSTRUCTION EFFECTIVELY
Choosing Media
To plan to achieve performance objectives, instructional designers should also choose a medium, or media, after selecting an instructional strategy. The term medium simply means the way an instructional message is communicated to the learner. Although the term media has not always been used consistently by instructional designers, examples are easy enough to identify: books, programmed texts, computers, slides or tapes, videotape, and film.
A media selection model, sometimes called just a media model, is a decision-making aid. It is intended to guide the selection of instructional media according to their instructional and cost effectiveness. Many media selection models have been devised to help instructional designers, such as the classics by Reynolds and Anderson (1992). However, it should be noted that "half a century of research on media has yielded little in the way of general guidelines for media selection. That is, we are not able to conclude that one medium or combination of media is more effective overall, or even that one medium works better for a particular type of learning or category of subject matter" (Gagne and Medsker, 1996, p. 181).
The Range of Media
Instructional media range from simple to complex. This distinction can be understood in two ways. First, a medium that does not require much advance preparation can be considered simple, while one requiring much preparation can be considered complex. For example, direct experience, possibly occurring on the job, is simple because it does not require much preparation. Second, a medium that appeals to only one sense can be considered simple; a medium appealing to more than one sense can be considered complex. The fewer the senses to which instruction is designed to appeal, the less need there is to be concerned about the effect on each sense and about how media can appeal to the learners' senses in combination.
               The classification scheme below is listed from complex to simple media. The simplest media are placed at the bottom of the media "cone"; more complex media are placed at the top. This scheme is based on a classic list by Kemp (1985).
Media                                        Examples
Combinations of media             Interactive video; multi-image and sound computer-based training; multi-image/videotape; multi-image/audiotape; filmstrip/audiotape; slides/audiotape; print/videotape; print/audiotape
Projected motion pictures          Videotape; film
Projected still pictures               Computer program (displayed); overhead transparencies; slides; filmstrips
Audio recordings                       Compact disc recordings; audiocassette recordings
Non-projected materials            Job aids; photographs; diagrams; charts; graphs; flip charts; chalkboard; print materials
Tangible objects                         Models; objects/devices/equipment; instructors/speakers
Selecting Delivery Modes
To plan performance objectives, instructional designers should also choose a delivery mode. A delivery mode means the choice made about the conditions under which instruction is to be offered. Not to be confused with media or instructional strategy, the delivery mode is synonymous with the situation that confronts learners as they learn.
The range of delivery modes is not great. There are only four basic choices, according to a classic discussion of this issue (Ellington, 1985):
1.      Mass instruction involving many learners.
2.      Group instruction involving fewer learners.
3.      Individualized instruction involving only one learner at a time.
4.      Direct experience involving real-time learning, such as informal on-the-job training.
Make a selection of delivery mode based on the performance objective to be achieved (see Figure 11.2). If many people share the same instructional need, select mass instruction. It is appropriate, for instance, when everyone in the same organization should receive the same instruction. If only some people, such as employees in one work unit, require instruction, select group instruction. It is often appropriate for introducing new work methods or new technology. If only one person experiences an instructional need, select individualized instruction. If the need is a minor one, not really enough of a "chunk" of information to warrant preparation of a planned learning experience, then rely on such direct experiential methods as supervisory coaching or on-the-job training to supply learners with what they need to perform competently (Rothwell and Kazanas, 1994b; Rothwell and Kazanas, in press).
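As a rough sketch, the branching logic just described can be written out in code. The group-size threshold of 30 and the function name are assumptions made for illustration; they are not part of Ellington's classification:

```python
def select_delivery_mode(audience_size: int, minor_need: bool = False) -> str:
    """Pick a delivery mode following the branching described above.

    audience_size: how many learners share the instructional need.
    minor_need: True when the need is too small a "chunk" to justify
                a planned learning experience.
    """
    if minor_need:
        # e.g. supervisory coaching or informal on-the-job training
        return "direct experience"
    if audience_size == 1:
        return "individualized instruction"
    # The boundary between "group" and "mass" is a judgment call;
    # 30 learners is used here purely for illustration.
    if audience_size <= 30:
        return "group instruction"
    return "mass instruction"

print(select_delivery_mode(500))  # whole organization -> mass instruction
print(select_delivery_mode(12))   # one work unit -> group instruction
print(select_delivery_mode(1))    # one person -> individualized instruction
print(select_delivery_mode(5, minor_need=True))  # -> direct experience
```

The point of the sketch is only that the choice among the four modes is driven by the size and scope of the instructional need, not by the media or strategy already chosen.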
Once the delivery mode for the entire learning experience has been selected on the basis of terminal performance objectives, reconsider media selection for each enabling objective (Ellington, 1985).
Appreciating the Learner's Perspective: A Brief Overview of Cognitive Strategies
Just as much attention should be devoted to appreciating the learner's perspective as the instructional designer's perspective. Savvy instructional designers will thus think about cognitive strategies. "Cognitive strategies," write Gagne and Medsker (1996, p. 72), "are the learned capabilities that enable us to manage our own thinking and learning processes."
Input Cognitive Strategies
An input cognitive strategy depends on what learners choose to pay attention to. Learners may be stimulated to pay attention by events external to them, by their own choice, or by a combination. An example of external stimulation might include job loss, which would create a significant emotional event for learners that would stimulate their learning about the job search. An example of internal stimulation might include remembrance of career goals, which could motivate individuals to seek out new approaches to meeting those goals.
       Process Cognitive Strategies
A process cognitive strategy helps learners make sense of what they learn. Gagne and Medsker (1996, p. 75) list several:
·                     Rehearsal: trying out something new
·                     Elaboration: associating something new with something previously learned
·                     Organization: imposing a structure on what is newly learned through such methods as outlining, categorizing, or diagramming
Output Cognitive Strategies
                  An output cognitive strategy means that learners acquire new knowledge or skill by applying what they have learned and making meaning of their experiences. An example would be asking learners to prepare instruction on something they would like to learn. The teaching (output) focuses the learners' attention on organizing the new knowledge or skill to teach it to others. That is an output-oriented cognitive strategy. Individuals could use the same approach to make sense of what they want to learn.
       Feedback Cognitive Strategies
          A feedback cognitive strategy means that learners acquire new knowledge or skill by giving feedback to others. An example would be asking learners to hear a speech and provide feedback to another person about that speech. The process of giving feedback focuses the learners' attention on organizing the new knowledge or skill to provide feedback to others. That is a feedback-oriented cognitive strategy. For more information on cognitive strategies (this discussion has been quite limited), see, as a starting point, Clark (1992).
       Judging Instructional Strategy
     Instructional designers should be capable of judging instructional strategies specified by themselves and their colleagues. Instructional designers can thus evaluate how appropriate the strategy is. Instructional designers may find it useful to rely on a worksheet, like that shown in Exhibit 11.1, when they are called on to judge a specified instructional strategy. Every time an answer of no must be given, the instructional designer should reexamine the instructional strategy.
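A worksheet of the kind described can be sketched as a simple yes/no checklist. The questions below are invented examples for illustration; they are not the actual contents of Exhibit 11.1:

```python
def review_strategy(answers: dict) -> list:
    """Return the checklist questions that received a 'no' answer.

    Each returned question flags a point at which the instructional
    strategy should be reexamined, as described above.
    """
    return [question for question, is_yes in answers.items() if not is_yes]

# Hypothetical checklist entries, for illustration only.
answers = {
    "Does the strategy match the performance objectives?": True,
    "Is the strategy feasible within time and cost constraints?": False,
    "Does the strategy suit the learners' characteristics?": True,
}
for question in review_strategy(answers):
    print("Reexamine:", question)
```

Any question answered "no" is surfaced for review, mirroring the rule that each "no" on the worksheet calls for reexamining the strategy.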
      Justifying Instructional Strategy
            Instructional designers should be capable of justifying the instructional strategy they have chosen. As in most instructional design activities, they are held accountable by other stakeholders for what they do. Instructional designers should thus be prepared to answer questions such as the following:
·      Why was an instructional strategy chosen?
·      What assumptions guided the choice of strategy? More specifically, what did instructional designers assume about the nature of learning and instruction?
·      Who should care about the instructional strategy?
·      Why should stakeholder care about the instructional strategy?
     Acting Ethically in Specifying Instructional Strategies
            A key ethical issue in specifying instructional strategies can be expressed by this question: has as much emphasis been placed in the instructional design process on cognitive strategies as on instructional strategies? A danger exists in placing too much emphasis on instructional strategies. Doing that may diminish the learner's role and lead to an overemphasis on glitzy technology rather than on the result. Instructional designers who act ethically will pay as much attention to learners as to instructors, and as much attention to how learners can be helped to learn as to instructional strategies. The two issues are very much related.
Choosing Or Designing Instructional Materials
All the previous steps in the instructional design process led to the instructional materials that will help learners achieve the desired performance goals.
A.             An Overview of Steps in Selecting or Designing Instructional Materials
Instructional designers take several steps to select, modify, or design instructional materials:
1. Preparing a working outline
2. Conducting research
3. Examining existing materials
4. Arranging or modifying existing materials
5. Preparing tailor-made instructional materials
6. Selecting or preparing learning activities
Step 1: Preparing a Working Outline
Preparing a working outline, sometimes called a syllabus, is the first step in designing instructional materials. A working outline summarizes the contents of the planned learning experience.
Step 2: Conducting Research
            Conducting research, the second step in designing instructional materials, is carried out to identify materials available inside or outside an organization. Suffice it to say that the cost of developing tailor-made materials is usually formidable.
Step 3: Examining Existing Instructional Materials
Evaluating existing instructional materials is the third step in the process of designing instructional materials.
Step 4: Arranging or Modifying Existing Materials
Arranging or modifying existing materials is the fourth step in the process of designing instructional materials.
a.    Securing copyright permissions
b.   Arranging instructional materials
Step 5: Preparing Tailor-Made Instructional Materials
a.    Traditional Components of an instructional package
b.   Differences of opinion about components of an instructional package
Step 6: Selecting or Preparing Learning Activities
a.    Selecting existing learning activities
b.   Preparing individual learning activities
Examples of individualized learning activities may include the following:
·         Reading a book
·         Interviewing others
·         Reviewing documents
·         Addressing a group on a new topic
·         Finding a problem
·         Researching a subject
·         Watching a videotape
·         Observing others
·         Demonstrating a skill
·         Performing a job
·         Starting something new
·         Solving a problem
c.    Preparing group learning activities
Group learning activities are perhaps most frequently associated with experiential instructional methods in classroom settings. While the results of research studies on the relative effectiveness of group learning activities in classroom instruction have proved largely inconclusive (see, for instance, Carroll, Paine, and Ivancevich, 1992; Newstrom, 1980), it appears that some group learning activities are better suited than others for meeting specific types of performance objectives.
a)      an interview guide for collecting case-study information
b)      a framework for preparing a role play
c)      an interview guide for gathering information on critical incidents
EVALUATING INSTRUCTION
Instructional designers often believe that instruction is not finished until it is apparent that the targeted learners can learn from the material. Concerned with helping to formulate instruction, this step in the instructional design process calls for formative evaluation. It is usually distinguished from summative evaluation, which helps summarize the results of instruction (Bloom, Hastings, and Madaus, 1971); formative evaluation is conducted before instructional materials are delivered to the majority of the targeted learners. Summative evaluation is conducted after instructional materials have been used with targeted trainees and results have been measured.
1.            Assumptions About Formative Evaluation
      Instructional designers make three fundamental assumptions when evaluating instructional materials and methods:
·   They view evaluation as primarily a formative process.
·   Instructional designers assume that evaluation means the process of placing value on something (Rothwell and Sredl, 2000).
·   Instructional designers expect to collect and analyze data as part of the evaluation process.
2.            Defining Terms Associated with formative Evaluation
              Instructional designers should take the time to familiarize themselves with at least two key terms: formative product evaluation and formative process evaluation.
·   Formative Product Evaluation
  Formative product evaluation means the process of appraising instructional materials during preparation.
·   Formative Process Evaluation
 Formative process evaluation is related to formative product evaluation and means the appraisal of instructional methods, that is, how planned learning experiences are delivered or facilitated.
3.               Four Major Approaches to Conducting Formative Evaluation
     a. Expert reviews
   Expert reviews focusing on content are, by definition, conducted by subject matter experts (SMEs), individuals whose education or experience with respect to the instructional content cannot be disputed.
b. Management or executive rehearsals
   Management or executive rehearsals are different from expert reviews. They build support by involving key stakeholders in the preparation and review of instructional materials prior to widespread use.
c. Individualized pre-tests and pilot tests
   Individualized pre-tests and pilot tests are another approach to formative evaluation.
d. Group pre-tests and pilot tests
   Their purpose is to find out just how well a randomly selected group of participants from the targeted trainee group fares with the instruction.
Conclusion
The final step in the model of the instructional design process we unveiled in Chapter Four, formative evaluation provides a means by which to improve instructional materials before they are released for widespread use. In this chapter, we clarified assumptions about formative evaluation, defined key terms associated with it, provided a case study to dramatize important issues in developing a formative evaluation plan, offered advice about judging and justifying formative evaluations, and reviewed key ethical and cross-cultural issues affecting formative evaluations. In the remaining chapters of this book, we turn to competencies linked to managing and communicating about instructional design projects. In the next chapter, we focus on the instructional management system that is essential to ensure that learners can receive the training, that they can begin in the proper place, and that their progress can be appropriately tracked.
