
Chapter 3: Analysis of the survey questionnaire

This chapter reports on a survey conducted at the Faculty of Electronics and Telecommunications, COLTECH, VNUH, for the purpose of evaluating the textbook OEE. The courses based on this textbook have been available to undergraduate students for four years. However, in 2006 the ESP staff began a project of writing new ESP materials to replace the old book. The teaching staff decided soon thereafter that a research project needed to be initiated to determine the overall pedagogical value of this book and so lay the foundation for developing new materials.

3.1 Description of the survey questionnaire

The survey was launched at the beginning of the 2006 academic year. Before that, the questionnaires had been scrutinized by the supervisor and other colleagues in the ESP Department, and all the wordings had been carefully revised and edited to ensure a high level of transparency. During the preparation week for the new academic year beginning August 21, 2006, the survey instruments were handed in person to all the staff in the Division of English for Electronics and Telecommunications, who were asked to return their responses by September 1, 2006. The deadline was then extended, and all surveys received were included in the results. Three weeks later, at the orientation opening the new academic year, the Vietnamese version of the questionnaire was distributed face-to-face to the students of all four third-year classes majoring in Electronics and Telecommunications; it took them about thirty minutes to complete. The objectives and methods of the evaluation questionnaire, and issues such as how the target subjects were selected, how the data were collected, and how the reports on each gauged category were prepared, are described below.

3.1.1 Objectives

The survey was designed to accomplish the following objectives:
- Compare the perceptions of the students and their teachers regarding the relative significance of each evaluative criterion
- Identify and assess the gaps for improvement in the textbook OEE as well as its accomplishments
- Develop a future assessment instrument to gauge upcoming teacher-generated materials
It is hoped that the evaluation checklist would be of good use to other colleges or universities should they choose to adopt similar survey instruments for their own materials.

3.1.2 Targeted subjects

While the decision to use and evaluate a particular textbook is sometimes left up to individual teachers, some authors, such as Chambers (1997), have pointed out that this activity is usually more beneficial if it is collectively undertaken by everyone involved in the teaching and learning process. He suggests that when teaching materials are to be used by a large group of teachers and students, it seems sensible for these materials to be evaluated by all or most of those who are involved in their use. As such, this study relied on the active participation of all eight ESP course instructors of the Division of English for Electronics and Telecommunications as well as the approximately one hundred and twenty students who were enrolled in the K49 programs.

3.1.2.1 The teachers

The survey questionnaires were delivered to all eight members of the Division of English for Electronics and Telecommunications. All of them have experience teaching OEE, ranging from one semester (two members) to seven semesters (four members).
In addition, six of the eight members were involved in the design and revision of ‘English for Electronics and Telecommunications – Volume 1’ in 2003. Seven of them have completed, or are pursuing, the Master’s course at CFL, VNUH. As mentioned in the previous chapter, two teachers in this group have been sent to De La Salle University for a one-month course on Syllabus Design and Curriculum Management, and four of them have taken part in the course on Tertiary Education Reforms offered by the Faculty of Education, VNUH. These facts suggest that, although relatively young, the course instructors have been adequately trained to design and evaluate materials, and their judgments about OEE are therefore reasonably reliable.

3.1.2.2 The students

Because of the relatively small number of third-year students in the Faculty of Electronics and Telecommunications, the project team decided to survey the whole population rather than select a proportional sample. The survey population consists of approximately 120 students in total. To keep the task manageable, the students who reviewed and rated the textbook were restricted to K49 members who had worked with the first half of the book. They were required to review the book carefully and rate it according to the criteria provided, based on their own experience. To ensure that the reviewers could understand the evaluation criteria and would consistently follow the evaluation procedure, the checklist was translated into Vietnamese. Moreover, an extensive explanation of the rating and weighting schemes was offered to every class to eliminate the chances of misunderstanding. After attending a discussion of the evaluation criteria and applying them to a number of examples, the reviewers had the opportunity to review the book and work through the ratings individually.

3.1.3 Materials examined

The time and rigor required for the analysis procedure made it impractical to evaluate the whole OEE set, which includes a student book, an answer book, and a cassette tape. Therefore, the analysis was based solely on the workbook. The evaluation criteria were based on an existing analysis of students’ needs and current research in the field, and were organized in five categories, each of which focuses on one facet: organization/format, electronics content, language content, skill, and methodology.

3.1.4 Methods and procedures

The study follows a qualitative research approach, describing the different ways in which the students and the teachers experience, understand, and gauge the course book OEE. From another angle, this research can also be regarded as part of an action research project that aims at improving the quality of teaching materials. Information requested from the participants included rankings of the various areas covered by the five categories in the questionnaire. Additional space was provided for any other comments that each respondent cared to make. The data reported in this thesis also include a portion of the comments and responses to the open-ended questions.
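To make the structure of the instrument concrete, the sketch below models a single checklist item as described above: each of the forty-three criteria belongs to one of the five categories, receives an importance weighting on the three-point scale, and receives a rating on the four-level evidence scale (scarcely, partially, mostly, fully evident) used later in the analysis. This is a minimal illustrative reconstruction, not code from the study; the class and field names are assumptions introduced only for this sketch.

```python
# Illustrative data model for one questionnaire item (not part of the original study).
# Category names, scale labels, and the three-point weighting follow the descriptions
# in sections 3.1.3-3.2.2; all identifiers are assumptions made for this sketch.
from dataclasses import dataclass

CATEGORIES = ["organization/format", "electronics content",
              "language content", "skill", "methodology"]

WEIGHTING_SCALE = (1, 2, 3)                      # three-point importance scale
RATING_SCALE = ["scarcely evident", "partially evident",
                "mostly evident", "fully evident"]

@dataclass
class ChecklistResponse:
    criterion: str     # e.g. "Size of print is appropriate."
    category: str      # one of CATEGORIES
    weighting: int     # importance assigned by the respondent (1-3)
    rating: str        # how far the criterion is realized in OEE
    group: str         # "student" or "teacher"

    def is_positive(self) -> bool:
        """Positive feedback = 'mostly evident' or 'fully evident' (cf. Step 3)."""
        return self.rating in ("mostly evident", "fully evident")
```

Under this reading, one completed questionnaire is simply a list of forty-three such responses, which is what the collation step tabulates.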
The analysis of the survey results was performed in several steps.

Step 0: Questionnaire design. The questionnaire was elaborately constructed, focusing on the areas that would be the most important in any ESP course book. With the assistance of the supervisor and other colleagues, the initial survey questionnaire was revised in terms of content, wording, and structure to ensure its clarity and validity.

Step 1: Data collection. After several necessary alterations, the questionnaire was duplicated and delivered to the targeted instructors and students. The survey was self-administered, and the data were collected about thirty minutes after delivery.

Step 2: Data collation. The data from the course instructors and the students were synthesized by hand and compiled in the form of tables. Those figures were then computerized and processed using Microsoft Excel, a software package widely used by social and behavioral scientists.

Step 3: Data analysis. First, standard frequency distributions were computed for both the importance assigned to each criterion and its realization in the book. Second, the means of the weighting scheme were calculated for the two groups of respondents; for the rating scheme, the numbers of respondents with positive feedback (answers of mostly evident and fully evident) were summed and converted into percentages. Third, for each category, these figures were presented together on charts for the sake of comparison. These tools improve inferences by making the analyses quantitative rather than merely qualitative. In addition, in the course of analyzing the questionnaire results, information from the open-ended questions may also be quoted to support or challenge the findings. Finally, all these statistics were combined to identify “gaps for improvement” in OEE: criteria regarded as significant and having the largest “gaps” (rated partially or scarcely evident) should receive greater emphasis when designing the replacement materials.
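As a concrete illustration of Step 3, the sketch below reproduces the two summary statistics for a single hypothetical criterion, the mean weighting per respondent group and the percentage of positive ratings (mostly or fully evident), and then flags the criterion as a candidate “gap for improvement”. The sample responses and the 50% threshold are invented for demonstration only; the thesis performed these calculations in Microsoft Excel rather than in code.

```python
# Illustrative re-implementation of the Step 3 computations (the study used Excel).
# The sample responses and the 50% threshold below are invented for demonstration.
from statistics import mean

# (group, weighting 1-3, rating) responses for ONE hypothetical criterion
responses = [
    ("student", 3, "partially evident"),
    ("student", 2, "scarcely evident"),
    ("student", 3, "mostly evident"),
    ("teacher", 3, "partially evident"),
    ("teacher", 2, "partially evident"),
]

POSITIVE = {"mostly evident", "fully evident"}

def mean_weighting(group: str) -> float:
    """Mean of the three-point importance weightings for one respondent group."""
    return mean(w for g, w, _ in responses if g == group)

def positive_percentage() -> float:
    """Share of respondents rating the criterion mostly or fully evident."""
    return 100 * sum(r in POSITIVE for _, _, r in responses) / len(responses)

students_mean = mean_weighting("student")
teachers_mean = mean_weighting("teacher")
average_mean = (students_mean + teachers_mean) / 2   # used to rank significance
realized = positive_percentage()

# A significant criterion that is rarely judged "evident" is a gap for improvement.
is_gap = average_mean >= 2.5 and realized < 50       # illustrative threshold
print(f"average mean = {average_mean:.2f}, positive = {realized:.0f}%, gap = {is_gap}")
```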
3.2 Data collation & analysis

3.2.1 Response rates

As mentioned above, the survey targeted two separate samples: the teaching staff and the learners. For the first group, eight copies were delivered to the teaching staff, who constitute the whole population of the Division of English for Electronics and Telecommunications; with the sincere cooperation of all the members, the response rate for this group reached 100%. For the second group, 120 copies of the translated questionnaire were distributed to the students, and 109 responded to the survey. However, seventeen respondents did not fully complete the questionnaire as required, so the number of valid responses was restricted to 92. This results in a response rate of 76.66%, which is considered very satisfactory for a written survey.

3.2.2 The significance of each criterion

This part focuses on the respondents’ judgments of the evaluative criteria in themselves, independent of how the criteria are realized in the book. The results of the forty-three questions are clustered under five main headings corresponding to the five facets examined in the questionnaire. The relative significance of each criterion is judged by the average of the two samples’ mean weightings: the higher this average mean, the more important the criterion is taken to be. In the following sections, each weighting against the three-point scale is discussed in more detail.

3.2.2.1 Format/Organization

One of the most useful starting points in any textbook evaluation is an analysis of the format and organization. The students and course instructors surveyed for this project indicated a general consensus on the importance of each criterion, even though the levels of significance vary marginally. The teachers and the students shared the same view on most of the criteria, such as a useful table of contents, glossary and index (1); clear and comprehensive introductions (2); appropriate print size (4); and a visually appealing format (5). However, while the teaching staff put a high emphasis on periodic assessments (3), the learners stressed the uniformity of the content layout (7) more.

Chart 3.1: The significance of the format/organization criteria.

The organizational criteria are ranked by descending significance as follows, with the heaviest weightings on explicit preambles (average mean 2.64) and a visually appealing format (2.59), and the lightest on appropriate print size (1.97).

Means of weighting (Students’ / Teachers’)  Organizational criteria
2.40 / 2.88  Units contain clear and comprehensive introductions of objectives. (2)
2.29 / 2.88  Format is visually appealing. (5)
2.38 / 2.75  Textbook provides a useful table of contents, glossary & appendix. (1)
2.12 / 2.88  There are periodic assessments. (3)
2.42 / 2.38  The textbook is uniform in content layout (throughout as well as in each unit). (7)
2.09 / 2.25  The table of contents shows a logical development of the subject. (6)
1.68 / 2.25  Size of print is appropriate. (4)
2.20 / 2.61  Average
Table 3.1: Format/organization criteria, arranged in descending order of importance

The teaching staff were sensible in placing an intense emphasis on such issues as statements of objectives (2), a pleasant-to-the-eye format (5), and continuous assessments (3). These findings accord with Tomlinson’s principles that learners must be ready to acquire the points being taught and that materials should achieve impact. However, the students did not show as much enthusiasm for periodic assessments as many of their teachers expected. This supports Tomlinson’s view that materials should help learners to feel at ease, because “relaxed and comfortable students apparently can learn more in a shorter period of time” (Dulay, Burt and Krashen, 1982). Ranking at the lower end of the table are such concerns as uniformity (7), logical development of the content (6), and appropriate size of print (4). Some students suggested that, besides the table of contents, the book should include resources for reference and further study, particularly online materials. Others recommended including a list of typical grammar reference material, which would be very convenient for their revision and self-study. These suggestions should be taken into consideration when the project of designing new materials is implemented.
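The ordering in Table 3.1 can be reproduced directly from the two columns of group means. The sketch below is only an illustration of that averaging rule, not part of the original Excel analysis: it averages the students’ and teachers’ mean weightings for each format/organization criterion and sorts the criteria by descending significance. The same rule yields the orderings in the remaining tables.

```python
# Reproducing the ranking in Table 3.1 from the reported group means.
# The numbers are copied from the table; the code itself is only an illustration,
# since the original analysis was carried out in Microsoft Excel.
format_criteria = {
    # criterion number: (students' mean, teachers' mean)
    1: (2.38, 2.75),   # useful table of contents, glossary & appendix
    2: (2.40, 2.88),   # clear and comprehensive introductions of objectives
    3: (2.12, 2.88),   # periodic assessments
    4: (1.68, 2.25),   # appropriate size of print
    5: (2.29, 2.88),   # visually appealing format
    6: (2.09, 2.25),   # logical development of the subject
    7: (2.42, 2.38),   # uniform content layout
}

ranking = sorted(
    ((s + t) / 2, num) for num, (s, t) in format_criteria.items()
)[::-1]  # descending average mean = descending significance

for avg, num in ranking:
    print(f"criterion ({num}): average mean {avg:.2f}")
```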
3.2.2.2 Electronics content

Many theorists believe that language and content cannot be distinctly separated from each other, and this holds particularly true in ESP. When using ESP textbooks, students are exposed not only to the English language but to knowledge of their specific major as well. To become fluent in ESP, students are required to master a significant portion of the content area. This principle establishes that a fundamental requirement for ESP textbooks is a proper representation of the subject content.

Chart 3.2: The significance of the electronics content criteria.

Generally, almost all the criteria stated were regarded as important or highly important. The first three were deemed of extreme importance: the materials are supposed to be aligned with the occupational demands (average mean 2.87) (1) as well as the course objectives (2.63) (2), and the subject matter needs to be up to date (3) in order to reflect new trends in electronics and telecommunications technology. Concerning the issue of further reading (10), there was a divergence between the two groups of respondents: unlike their instructors, the students showed a certain reluctance to study supplementary materials. Another discrepancy can be found in the opinions on realistic examples (8); the students paid more attention to examples than their instructors did, which deserves more pedagogical consideration from the instructors. For the rest of the criteria, the data analysis showed a close convergence between the two groups of respondents. They shared the same point of view on five of the ten criteria in this part, namely appropriate sequence of content (4); balance of content (5); material that builds on the students’ existing knowledge (i + 1) (6); well-integrated non-text materials (7); and linkage with other subject areas (9). (See Chart 3.2 and Table 3.2.)

Means of weighting (Students’ / Teachers’)  Electronics content criteria
2.73 / 3.00  The content meets the requirements for the occupational outcome(s). (1)
2.37 / 2.88  The content logically follows the course objectives stated. (2)
2.36 / 2.88  The subject matters are up-to-date. (3)
2.22 / 2.75  The content includes suggestions for further reading. (10)
2.37 / 2.25  The lessons are linked to other subject areas. (9)
2.21 / 2.38  Non-text content (maps, graphs, pictures) is well integrated into the text. (7)
2.46 / 2.13  Examples are realistic. (8)
2.17 / 2.25  The material builds on students’ existing knowledge. (6)
2.23 / 2.13  The sequence of content is appropriate, i.e., from comprehension to production. (4)
1.95 / 2.00  The content is balanced, i.e., one topic is not discussed to the detriment of others in terms of length. (5)
2.31 / 2.47  Average
Table 3.2: Electronics content criteria, arranged in descending order of importance

Other significant recommendations included guidelines on Internet resources, especially sources of e-books, and the provision of journal articles for the students to read and translate as homework.

3.2.2.3 Language content

Under ‘language content’, the textbook evaluation form asked the participants to consider whether the language included in the materials is realistic and applicable to the work-related environment. It also examined the extent to which the textbook encourages both personalization and specification, whereby students are required to use the language they have learned to engage in purposeful and genuine situations.

Chart 3.3: The significance of the language content criteria.

As revealed in Chart 3.3, the opinions of the two samples coincided on most of the key issues concerning grammar (1), vocabulary (2), pronunciation (3), the structuring and conventions of language use above sentence level (4), and “real” tasks (10). However, on the scale of importance this cluster of criteria ranked at the bottom of the list (see Table 3.3). In contrast, the teachers showed the utmost concern about the level of difficulty (7) and the suitability of the material to students’ various abilities, interests, and learning styles (8), whereas the students gave their top priority to occupationally oriented language (9). This result indicates that students nowadays are very pragmatic, and nothing may propel them to learn better than coverage of work-related language, such as the language of job interviews or application letters.
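The convergences and divergences noted above can be made explicit by comparing the two groups’ mean weightings criterion by criterion. The sketch below is illustrative only and not drawn from the thesis: it takes the language content means reported in Table 3.3 and flags the criteria on which the teachers’ and students’ weightings differ most, using an arbitrary 0.5-point threshold.

```python
# Flagging teacher/student divergences in the language content weightings.
# The means are those reported in Table 3.3; the 0.5 divergence threshold is
# an arbitrary illustrative choice, not one used in the study.
language_criteria = {
    # criterion number: (students' mean, teachers' mean)
    1: (2.33, 2.25), 2: (2.37, 2.25), 3: (2.09, 2.13), 4: (2.14, 2.25),
    5: (2.09, 2.88), 6: (2.34, 2.75), 7: (2.32, 3.00), 8: (2.24, 3.00),
    9: (2.42, 2.88), 10: (2.37, 2.63),
}

THRESHOLD = 0.5  # difference in mean weighting treated as a notable divergence

divergent = {num: round(t - s, 2)
             for num, (s, t) in language_criteria.items()
             if abs(t - s) >= THRESHOLD}

print("criteria with notable teacher/student divergence:", divergent)
# Criteria (5), (7) and (8) emerge, with the teachers weighting them markedly higher.
```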
Means of weighting (Students’ / Teachers’)  Language content criteria
2.32 / 3.00  Level of difficulty is suitable for the students’ acquisition. (7)
2.42 / 2.88  The activities ask the students to practice something that they are likely to use in their future jobs. (9)
2.24 / 3.00  The activities are appealing to a wide range of student abilities, interests and learning styles. (8)
2.34 / 2.75  Directions are clearly written and explained. (6)
2.37 / 2.63  The activities that ask the students to work together give them a “real” task. (10)
2.09 / 2.88  Text-types are various, including manuals, letters, dialogues, charts, diagrams, etc. (5)
2.37 / 2.25  Vocabulary is adequate in terms of quantity and range. (2)
2.33 / 2.25  The book covers the main grammar items characteristic of EST. (1)
2.14 / 2.25  The course book deals with the structuring and conventions of language use above sentence level, e.g., how to take part in conversations, how to structure a piece of writing. (4)
2.09 / 2.13  The course book includes material for pronunciation work. (3)
2.27 / 2.60  Average
Table 3.3: Language content criteria, arranged in descending order of importance

3.2.2.4 Skill

Chart 3.4: The significance of the skill criteria (means of weighting, Students’ / Teachers’): (1) 2.47/2.50; (2) 2.35/2.75; (3) 2.12/2.63; (4) 2.24/2.63; (5) 1.90/2.63; (6) 2.08/2.88; (7) 2.24/2.63; (8) 2.29/2.75; (9) 2.38/2.75.

Overall, the teacher respondents expressed more concern over the skill area than their students did. According to the figures, it was not the language skills that were deemed the most favored criteria but higher-order thinking skills such as analyzing, synthesizing, and evaluating (9), with an average mean of 2.58. This is understandable, because language is naturally regarded as the crux of thinking. Another reason lies in the fact that these skills have not been paid adequate attention.
