Year: 2020 | Volume: 13 | Issue: 1 | Page: 16-21
Assessing the learning outcomes and perceptions of focused didactic training workshop in micrometry skills
V Dinesh Kumar, Yogesh Ashok Sontakke, HY Suma, K Aravindhan
Department of Anatomy, JIPMER, Puducherry, India
Date of Submission: 13-Mar-2019
Date of Decision: 13-Jun-2019
Date of Acceptance: 15-Jun-2019
Date of Web Publication: 16-Dec-2019
Correspondence Address:
H Y Suma
Department of Anatomy, JIPMER, Puducherry
Source of Support: None, Conflict of Interest: None
Context: Micrometry skills, a combination of visual, cognitive, technical, and spatial expertise, are useful for establishing quantitative research outcomes with precision. We conducted a workshop, based on a modified Peyton's approach, aimed at transferring these skills for optimal research practice. Aims: Our principal aim was to document the perceived learning outcomes and knowledge gained from a focused didactic training workshop on micrometry skills. Settings and Design: In the 1-day hands-on workshop, basic concepts of micrometry were taught using ImageJ software. The learning gains of the workshop were evaluated using Moore's expanded outcomes framework. Subjects and Methods: We administered pre- and post-tests along with a postworkshop evaluation survey form. Statistical Analysis Used: Descriptive statistical analysis was used to estimate measures of central tendency. Results: Significant knowledge transfer was demonstrated by the increase in posttest scores. The feedback responses of the participants were overwhelmingly positive. Conclusions: To the best of our knowledge, this is the first documentation of a focused didactic micrometry skills training workshop using a modified Peyton's approach. The framework might serve as a solid base for conducting similar workshops in other medical schools.
Keywords: Histology, ImageJ, micrometry, Peyton's approach, skills training
How to cite this article:
Kumar V D, Sontakke YA, Suma H Y, Aravindhan K. Assessing the learning outcomes and perceptions of focused didactic training workshop in micrometry skills. Med J DY Patil Vidyapeeth 2020;13:16-21
How to cite this URL:
Kumar V D, Sontakke YA, Suma H Y, Aravindhan K. Assessing the learning outcomes and perceptions of focused didactic training workshop in micrometry skills. Med J DY Patil Vidyapeeth [serial online] 2020 [cited 2020 Jul 9];13:16-21. Available from: http://www.mjdrdypv.org/text.asp?2020/13/1/16/272889
Introduction
Histology is one of the educational and research domains that demands visual problem-solving abilities. At the undergraduate level, these visual abilities are mostly confined to pattern recognition and interpretation of histological slides. Moving from novice to proficient, a student becomes able to identify microstructures based on visual interpretation and differential identification. Experts, i.e., anatomists and pathologists, tend to use comparative pattern analysis by reactivating acquired mental imagery. These innate strategies and knowledge of structures differ among individuals and tend to manifest upon interaction with visual stimuli. Because the degree of interpretation varies with the level of visual mastery and microscope navigation, reliability is strongly influenced by interobserver variability.
Research in histology demands interpretation of specific visual characteristics such as color, three-dimensional interpretation of the visualized structures, and recognition of artifacts. As mentioned above, sole reliance on qualitative morphological variables might jeopardize the results of such endeavors. To establish research outcomes with precision, it is pertinent to perform quantitative measurements. Micrometry skills encompass the application of definite quantitative measurements, such as calibrating the size of visualized structures and counting the number of structures in a field. These skills, being relatively novel to subject experts, involve a combination of visual, cognitive, technical, and spatial expertise. Furthermore, if slides need to be navigated at more than one magnification, spatial expertise in zooming and panning comes into the picture.
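One of the measurements mentioned above, counting the number of structures in a field, amounts to counting connected foreground regions once an image has been thresholded to a binary mask. The following sketch illustrates that idea on a toy binary grid; it is a minimal illustration of the principle, not the algorithm ImageJ itself uses, and the `field` data are hypothetical.

```python
from collections import deque

def count_structures(mask):
    """Count 4-connected foreground regions in a binary mask.

    `mask` is a list of lists of 0/1; each connected run of 1s is
    treated as one structure (e.g., one follicle profile in a field).
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # new structure found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill its extent
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Hypothetical 3x5 field with three separate structures.
field = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_structures(field))  # 3
```

In practice, ImageJ's particle-analysis tools perform this after thresholding, but the counting step reduces to exactly this kind of connected-component labeling.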
The workshop aimed at acquainting faculty with micrometry skills so that they could transfer the acquired skills to research practice in their corresponding settings. The success of focused skill laboratory training largely depends on factors such as predetermined learning goals, the utility of the topic chosen, and the sustainability of the learned skills over time. Among the various instructional approaches available for teaching skills, we adopted the modified Peyton's approach for micrometry skills after considering the expertise involved. The steps, modified to fit the objectives of the workshop, are as follows:
In the first step, the facilitator, a faculty member with prior training in micrometry skills, demonstrates the skill. In the second step, the participants are guided through each subcomponent of the skill in detail with the assistance of the facilitators. Subsequently, the facilitator applies the steps to the demonstration sample and performs the skill sequentially. Finally, the participants are instructed to perform the skill on their own and to check for any difficulties arising in due course.
To our surprise, we were not able to find concrete evidence on how learning outcomes could be assessed following a micrometry skills workshop. Based on our phronesis and elements from work done in related areas, we tried to frame the components of the workshop. The essence of the workshop was building interaction between the participants, who were novices with respect to these particular skills, and the facilitators, who had gained acquaintance with them over time. Our objective was to narrow the relative knowledge gap existing between the two groups and to empower the participants in terms of capabilities. The challenge was that the learning content had to be adapted to suit the expected level of the participants.
Subjects and Methods
The study participants (n = 49) included anatomists and pathologists who had registered for the workshop following our invitation. All of them had expertise in teaching either histology or histopathology. Except for two, all were naïve regarding the software used in the workshop, ImageJ (Wayne Rasband, National Institute of Mental Health, Bethesda, Maryland, USA). Prereading materials were sent before the workshop to prime them for the sessions. Faculty members of the department of anatomy served as resource persons by delivering the didactic lectures, and postgraduate residents served as facilitators providing focused didactic training. Each resident catered to 3–4 participants, helping them troubleshoot issues and assisting with the hands-on training.
Upon arrival, the participants were provided with a pretest question sheet on the basic concepts of ImageJ utility. The software (free downloadable version) was installed on their respective laptops, and sample images to work on were provided. The framework of the workshop is summarized in [Table 1]. A separate station was earmarked for understanding the basics of micrometry calibration.
Of the available software for image analysis, we preferred ImageJ, a public domain, Java-based image processing and analysis program. It can operate on images of various bit depths and formats. Its key functions include geometric transformations and spatial calibration of the histologic images loaded into the program. Because of its easy access, it is widely used by biomedical researchers.
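The spatial calibration taught at the dedicated station reduces to simple arithmetic: a structure of known physical length (e.g., an interval on a stage micrometer) is measured in pixels, yielding a scale factor that converts subsequent pixel measurements into micrometres. The sketch below shows that arithmetic; the numeric values are hypothetical, chosen only for illustration.

```python
def microns_per_pixel(known_length_um, measured_pixels):
    """Calibration factor derived from a structure of known length."""
    return known_length_um / measured_pixels

def pixels_to_microns(pixel_length, scale_um_per_px):
    """Convert a measured pixel distance into micrometres."""
    return pixel_length * scale_um_per_px

# Hypothetical calibration: a 100 um stage-micrometer interval spans 250 px.
scale = microns_per_pixel(100.0, 250)        # 0.4 um per pixel
follicle_diameter_px = 180                   # hypothetical measurement
print(pixels_to_microns(follicle_diameter_px, scale))  # about 72 um
```

In ImageJ this corresponds to setting the scale once (distance in pixels against a known distance and unit) so that the measurement commands report calibrated values automatically.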
Each session of the 1-day workshop consisted of a lecture covering its objective and the purpose of the particular function/tool in the ImageJ software, a few common challenges that could crop up during its usage, and hands-on training with that tool. To facilitate the process, a sample image containing thyroid follicles was provided.
The main objectives of developing the questionnaire were to examine the impact of the micrometry workshop on the participants, their satisfaction, and the implementation process. Thus, the questionnaire was framed to encompass components such as (a) assessing the overall implementation and delivery, (b) measuring the degree of satisfaction with the teaching content and resources provided, (c) determining whether the workshop framework adequately met the anticipated learning needs, (d) examining the impact on research practice, and (e) obtaining suggestions that could help in rectifying flaws.
- The pre- and post-test question sheet consisted of 15 questions aimed at testing basic knowledge of the key features and utility of ImageJ. The questions were subjected to peer review to ascertain test–retest reliability
- The evaluation questionnaire (18 Likert scale response-based and three free-text response-based questions) comprised (i) questions to evaluate the quality, content, and usefulness of each session; (ii) four questions to evaluate the hands-on training session, three questions to evaluate the program as a whole, three questions to evaluate the arrangements including reading materials, and three questions to grade the self-perceived attainment of anticipated learning needs; (iii) response-based questions ascertaining prior experience with image analysis programs; (iv) free-text response questions on facilitators of and barriers to learning in the workshop; and (v) three questions to grade the self-perceived learning gains related to the objectives of the workshop. All Likert-type questions were graded on a scale from 1 to 5, with 1 being strongly disagree/not at all useful and 5 being strongly agree/very useful.
With the exception of free-text responses, responses to the remaining items were compiled, and descriptive statistical analyses were used to obtain measures of central tendency.
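The descriptive analysis described above reduces to means and standard deviations of the compiled scores. As a minimal sketch using Python's standard `statistics` module, with hypothetical score lists (the article reports only summary figures, not raw data):

```python
import statistics

# Hypothetical pre/post-test scores for illustration only; the article
# reports summary figures (pre: mean 7.8, SD 1.5; post: mean 10.6, SD 1.8).
pre = [7, 9, 6, 8, 10, 7, 8]
post = [10, 12, 9, 11, 13, 10, 9]

print(f"pre:  mean={statistics.mean(pre):.2f}, sd={statistics.stdev(pre):.2f}")
print(f"post: mean={statistics.mean(post):.2f}, sd={statistics.stdev(post):.2f}")

# Mean per-participant gain (post minus pre, paired by participant).
gains = [b - a for a, b in zip(pre, post)]
print(f"mean gain={statistics.mean(gains):.2f}")
```

`statistics.stdev` computes the sample standard deviation, which is the usual choice when the participants are treated as a sample rather than the whole population of interest.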
Results
As part of the workshop evaluation, the entire process, including feedback and pre- and post-tests, was granted a waiver from ethical review. All the participants (n = 49), comprising faculty and postgraduate residents, responded positively to the tests and voiced their perceptions in an uninhibited manner. Because correlating demographic variables with learning outcomes was not one of our objectives, and to maintain anonymity, we did not include such specifications in the forms. Four out of 49 participants had attended prior workshops related to histological image processing (three, stereology; one, micrometry), and six out of 49 were already using at least one technical modality (predominantly Adobe Photoshop®, Thomas and Knoll, Adobe Inc., Michigan, USA) for postprocessing images before sending them for publication.
The mean score of all the pretest forms was 7.8, with a standard deviation (SD) of 1.5, and the mean score of all posttest forms was 10.6, with an SD of 1.8 [Figure 1].
|Figure 1: Comparison of mean pre- and post-test scores of the participants|
Evaluation of sessions
The sessions were evaluated in terms of quality, content, and usefulness, and the mean score was obtained in each of the three domains. Overall, the mean score for the quality of the sessions was 3.86, with an SD of 0.36; for content, it was 3.62, with an SD of 0.54; and for usefulness, it was 4.23, with an SD of 0.21. A majority of the participants (83%) perceived that the hands-on training was adequate to meet the objectives of the workshop. Nearly 92% of the participants felt that the instructions provided by the facilitators were clear and appropriate and that their doubts were satisfactorily addressed, and 74% felt that the hands-on training session improved their skills in micrometry and the analysis of histologic images. The overall rating of the hands-on training session was 3.9, with an SD of 0.43.
Satisfaction regarding arrangements
The majority of the participants were satisfied with the arrangements, including reading materials, distribution of content, and audio–visual arrangements made by the organizers, with 44% finding them “excellent” and 53% finding them “satisfactory.” Almost 92% felt that the workshop covered all the important aspects required for using micrometry in their research practice and expressed their wish for regular workshops of the same sort in the future.
Self-perceived improvement in learning outcomes and gains
The degree of utility is summarized in [Table 2].
|Table 2: Compilations of responses (1 - being least useful/strongly disagree and 5 - being most useful/strongly agree)|
The pertinent responses for the three questions that allowed free-text answering are compiled in [Table 3].
|Table 3: Pertinent responses for questions that allowed free-text answering|
Discussion
We originally developed the framework for the micrometry skills workshop with the intention of transmitting its basics and having peer faculty apply some of the learned practices in their corresponding settings. Because we could not find valid literature exploring the learning impact of micrometry skills training, this article is designed to assess the effectiveness of the framework. Given that, in the health-care setting, a proposed change will not achieve its intended outcomes until it is adequately supported by scientific evidence of benefit, this article might add to the “strength of evidence.” As the sample size was small, the results of quantitative analysis alone could not be considered a meticulous basis; we therefore included additional qualitative components eliciting free-text responses to fortify the evidence.
Considering the ideal criteria, skills chosen for a workshop should be trainable and practicable in low-resource settings on a small scale. In addition, they should be aligned with the broader goals of the discipline and offer at least some advantage over existing practice. Thus, we felt that micrometry would be an optimal area for sustainable knowledge transfer in the anatomy discipline. The assessment framework is based on Moore's expanded outcomes framework. The first level (participation) was accomplished by ensuring the attendance of the participants. The second level (satisfaction) was addressed by the components of the questionnaire. For assessing the third level (learning), objective measures, i.e., pre- and post-tests, and subjective measures, i.e., self-perceived learning gains, were used. In addition, the level of satisfaction with the hands-on session and performance under assistance helped us address level 4 of Moore's framework. Questions regarding the environmental setting were also included because significant learning activity takes place in the context of “authentic” settings.
Comparison of the pre- and post-test scores reveals a statistically significant learning gain among the participants. Nevertheless, because the questions were based on cognitive facts, the increase in posttest scores cannot be equated with the gaining of skills. We presume that the modified Peyton's approach, particularly the second and third steps, whereby the facilitator guides the participants through the subcomponents and then applies them sequentially to the demonstration sample, served the crucial purpose of “internalizing” the skills. As pretrained facilitators were deployed in a ratio of 1:4 to help participants navigate the software and gain the skills, the alignment of the hands-on training with the workshop objectives was achieved to our satisfaction.
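A significant pre/post learning gain of this kind is conventionally tested with a paired comparison, since each participant serves as their own control. As a hedged sketch (the raw scores are not published, so the data below are hypothetical), the paired t-statistic can be computed from the per-participant differences:

```python
import statistics
from math import sqrt

def paired_t(pre, post):
    """Paired t-statistic for per-participant gains (post - pre)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # t = mean difference divided by its standard error
    return statistics.mean(diffs) / (statistics.stdev(diffs) / sqrt(n))

# Hypothetical paired scores for illustration only.
pre = [7, 9, 6, 8, 10, 7, 8, 9]
post = [10, 12, 9, 11, 13, 10, 9, 12]
t = paired_t(pre, post)
print(f"t = {t:.2f} on {len(pre) - 1} degrees of freedom")
```

The resulting statistic is compared against the t-distribution with n - 1 degrees of freedom; in practice a library routine such as SciPy's paired t-test would also return the p-value directly.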
The high levels of enthusiasm and the positive responses related to the usefulness of the sessions, self-perceived learning gain, and the ability to meet the intended outcomes of the workshop were clearly reflected in the response sheets. Our main aim was to look at improvement in knowledge and skills and to assess up to Kirkpatrick's second level, as mentioned above. To some extent, level 3 was achieved by observing participants apply their knowledge to the given sample histologic images. Thus, the cardinal objectives of the workshop were achieved to a satisfactory extent.
Considering the sample size, it would not be appropriate to generalize the findings. The true gain would be the ability of the participants to demonstrate a difference in performance, as well as increased participatory yields in the form of working tasks (long-term effects). Considering feasibility, a controlled intervention to document concrete learning gains could not be performed. In addition, even though our participants ranged from postgraduate students to senior professors, we could not stratify them by experience for two reasons: first, to maintain anonymity, we decided to include only two demographic questions (specialty and faculty/postgraduate), and second, micrometry is a relatively new technique for most, if not all, participants. Of the participants, only one had previous experience with the ImageJ software, which limits the influence of experience on perceived learning outcomes. Nevertheless, these data could enable other medical schools to design their own frameworks and support other faculty cohorts in acquiring these skills.
Conclusions
Even though the strength of evidence is comparatively low, this article addresses a relevant gap in the available literature regarding the outcomes of a focused didactic training workshop in micrometry skills. The framework, with outcomes based on Moore's model and feedback based on Kirkpatrick's levels, might serve as a solid base for conducting similar workshops in other medical schools. The overall results show significant improvement in both knowledge and skills after the workshop, suggesting the benefits of the modified Peyton's approach. Thus, it could be a first-of-its-kind working model for building research expertise in the biomedical arena.
We would like to acknowledge the faculty, residents, and technicians of the Department of Anatomy, JIPMER, and our esteemed delegates for giving their voluntary informed consent for participating in the feedback process.
Financial support and sponsorship
Self-organized funding was collected from registered delegates.
Conflicts of interest
There are no conflicts of interest.
References
Jaarsma T, Jarodzka H, Nap M, van Merrienboer JJ, Boshuizen HP. Expertise under the microscope: Processing histopathological slides. Med Educ 2014;48:292-300.
Jaarsma T, Boshuizen HP, Jarodzka H, van Merriënboer JJ. To guide or to follow? Teaching visual problem solving at the workplace. Adv Health Sci Educ Theory Pract 2018;23:961-76.
Lynagh M, Burton R, Sanson-Fisher R. A systematic review of medical skills laboratory training: Where to from here? Med Educ 2007;41:879-87.
Nikendei C, Zeuch A, Dieckmann P, Roth C, Schäfer S, Völkl M, et al. Role-playing for more realistic technical skills training. Med Teach 2005;27:122-6.
Ericsson KA, Krampe RT, Teschromer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev 1993;100:363-406.
Krautter M, Weyrich P, Schultz JH, Buss SJ, Maatouk I, Jünger J, et al. Effects of Peyton's four-step approach on objective performance measures in technical skills training: A controlled trial. Teach Learn Med 2011;23:244-50.
Lake FR, Hamdorf JM. Teaching on the run tips 5: Teaching a skill. Med J Aust 2004;181:327-8.
Plsek P. Spreading Good Ideas for Better Health Care: A Practical Tool Kit, Tools, Perspectives and Information for Health Care Leaders. Vol. 2. VHA Research Series; 2000.
Rogers EM. Diffusion of Innovations. 4th ed. New York: Free Press; 1995.
Moore DE Jr., Green JS, Gallis HA. Achieving desired results and improved outcomes: Integrating planning and assessment throughout learning activities. J Contin Educ Health Prof 2009;29:1-5.
Janssen-Noordman AM, Merriënboer JJ, van der Vleuten CP, Scherpbier AJ. Design of integrated practice for learning professional competences. Med Teach 2006;28:447-52.
Bates R. A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Eval Program Plann 2004;27:341-7.