Critical thinking skills: do we have any? Critical thinking skills of faculty teaching medical subjects in a military environment.
Author: Carol F. Hobaugh
Publication: U.S. Army Medical Department Journal, Oct-Dec 2010. U.S. Army Medical Department Center & School. ISSN 1524-0436.
Countless course introductions and administrative announcements
include the phrase, "in this course we will not teach you what to
think; rather you will be taught how to think." The phrase may be
declared so often as to have become cliché; nonetheless, that goal is
not necessarily realized, for any number of reasons. Two major issues
related to the definition and the teaching of critical thinking were
addressed in 2 important studies that informed my own doctoral research *:
* Expert Consensus on Critical Thinking: The American Philosophical Association (APA) Delphi Report (1990). In this landmark study, a panel of 46 well-known scholars and experts in the field collaborated over a period of 20 months to articulate an international expert consensus definition of critical thinking:
* California Teacher Preparation for Instruction in Critical Thinking (1997): This compelling study, commissioned by the California Commission on Teacher Credentialing, determined that faculty subjects were indeed confident that they understood the concepts of critical thinking and were successful in teaching it. Paul states unequivocally that they were also wrong: fewer than 10% of the teachers actually taught critical thinking, could enumerate the critical thinking criteria or standards required of students, or could provide clear conceptions of critical thinking. (2(p3))
There are many regulatory mandates and professional guidelines establishing requirements for critical thinking, particularly with a focus on healthcare. Among those are requirements for nursing, physical therapy, respiratory therapy, dentistry, pharmacy, laboratory specialties, social work, and clinical psychology. The Pew Health Professions Commission also stipulated critical thinking, reflection, and problem-solving skills among its set of competencies for the 21st century. (3)
In addition to Army training guidance reiterating the necessity for self-aware and adaptive problem-solvers, there emerged a compelling requirement for instructors who not only have relevant experience and skills necessary for training students, but also the ability and expertise to teach critical thinking. (4,5)
Therein lay a problem. Instructors at the US Army Medical Department (AMEDD) Center and School (AMEDDC&S) have been selected for their expertise in the particular subject matter they are to teach. However, they do not generally have a background or training in education. Nor have they necessarily been specifically trained in critical thinking skills. This is not unlike any number of situations facing many other teachers. Duldt succinctly describes this quandary:
It was from the Pew Commission's broad-based educational perspective that my doctoral study was conceived, and it was situated in the context of the AMEDD. My study measured the critical thinking abilities of instructors with divergent training backgrounds and experience. All instructors assigned to the Academy of Health Sciences were encouraged to participate in the study. The population was 730 potential subjects. The Noncommissioned Officers Academy was not included in this study.
Because this research involved the use of a cognitive test, it was exempt from full institutional review board (IRB) review and continuous monitoring. All protocols associated with this research were submitted to the University of the Incarnate Word (San Antonio, Texas) IRB to determine the study's review status. It was understood that personally identifiable information would remain confidential throughout the research and thereafter. Signed informed consent statements were obtained from each participant at the beginning of each testing session.
Participation in this study was voluntary, and at all times subjects retained the option to refuse to participate and withdraw from the study. Results of the study were reported in the aggregate to protect the identity of individual subjects.
In addition to permission sought from the University of the Incarnate Word IRB, letters seeking permission and approval for this study were secured from the AMEDD Clinical Investigation Regulatory Office, the Judge Advocate General, the Fort Sam Houston Army Garrison, the AMEDDC&S Chief of Staff, and the Dean/Commandant of the Academy of Health Sciences, AMEDDC&S.
California Critical Thinking Skills Test (CCTST). Developed by Facione and associates specifically to assess the skills dimension of critical thinking as defined by the APA Delphi Report, the CCTST measured critical thinking characterized as the
The test consisted of 34 multiple-choice questions which were intended to be free of critical thinking jargon and technical vocabulary. No discipline-specific content knowledge was presumed, and although it was designed primarily for post-secondary level assessment, the CCTST had been used successfully across a wide range of subjects from 10th grade to graduate and professional school students.
The CCTST yielded 6 scores: an individual's overall score as well as scores on 5 subscales. The subscales were analysis, evaluation,
inference, deductive reasoning, and inductive reasoning. Facione et al noted that these cognitive skills operate interdependently and interactively rather than as isolated factors; therefore, scores on the subscales were intended primarily as gross indicators of strengths and weaknesses. (7(p13)) The maximum scores for the overall CCTST and each of the subscales were as follows: overall=34, analysis=7, inference=15, evaluation=12. The last 2 subscales were reclassified and captured as induction=17 and deduction=17. Validation studies of the CCTST Form 2000 produced internal consistency estimates of KR20 ** equal to 0.80 and 0.78. Because more than one critical thinking skill was being tested with a single instrument, reliability ratings of 0.65 to 0.75 should be considered sufficient. (7(p15)) Having been developed as conceptually consistent with the Delphi expert consensus definition, the CCTST was assumed to have both content and construct validity. Criterion validity was substantiated with correlations to college grade point average, Scholastic Aptitude Test (The College Board, New York, New York) verbal and math scores, and the Nelson-Denny Reading Test (Riverside Publishing Company, Rolling Meadows, Illinois) scores.
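The KR-20 internal-consistency estimate cited above is computed directly from dichotomously scored item responses. The following sketch is illustrative only; the response matrix is invented, not the CCTST validation data.

```python
# Illustrative sketch: Kuder-Richardson Formula 20 (KR-20) for a test of
# dichotomously scored items. Rows = test takers, columns = items,
# 1 = correct, 0 = incorrect. The data below are hypothetical.

def kr20(responses):
    n = len(responses)          # number of test takers
    k = len(responses[0])       # number of items
    # proportion answering each item correctly (p); q = 1 - p
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # population variance of the total scores
    totals = [sum(r) for r in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_total)

# toy data: 4 test takers, 5 items
responses = [
    [1, 1, 1, 1, 1],
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(round(kr20(responses), 3))  # → 0.865
```

Values near 0.80, as reported for the CCTST Form 2000, indicate that the items hang together reasonably well as a measure of a common underlying skill set.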
A demographic questionnaire was researcher-developed and included requests for information regarding gender, age, ethnicity, rank, military medical specialty designator (military occupational specialty or area of concentration), education level, teaching assignment, assignment experience, deployment experience, and years of service.
Answer sheets were sent to the publisher, Insight Assessment (California Academic Press, Millbrae, California), to be scored. Insight Assessment returned individual test-takers' overall scores as well as scores on each of the subscales. The demographics data were tabulated by the researcher using SPSS and coordinated with CCTST results as they were returned from the publisher. The data were not used to report critical thinking skills of individual instructors. All results were reported in the aggregate to ensure anonymity.
The following hypothesis and ancillary hypotheses drove the study.
H1: Instructors of military students have the necessary skills to teach critical thinking skills.
H2: There are differences in the level of critical thinking skills of officer and enlisted instructors.
H3: There are differences in the level of critical thinking skills of instructors teaching leadership courses and those who teach a technical medical specialty.
H4: There are differences in levels of critical thinking skills among instructors teaching in the various medical military occupational specialties.
H5: There are differences in the level of critical thinking skills of instructors with a field TOE (table of organization and equipment) background, those who have primarily a TDA (table of distribution and allowances) background, and those who have experience in both types of assignments.
H6: There are differences in the level of critical thinking skills of instructors who have combat experience, humanitarian deployment experience, those who have both, and those who have neither type of experience.
Because this study was supported by the leadership of the AMEDDC&S, a great deal of data covering a wide range of variables was collected with a view toward supporting future studies or projects across departments. Data collected for those secondary and tertiary (ancillary) purposes were considered tangential, and only data specifically relevant to this study were reported. The demographic data, descriptive of the sample, were presented first. Inferential statistics that addressed each of the research hypotheses followed. An alpha ([alpha]) level of .05 was used for all statistical tests.
After a review of the Table of Distribution and Allowances, *** it was estimated that there were 730 instructors in the sample population at the AMEDDC&S at the time of the study. Three hundred fifty-five (49%) individuals actually participated. Nine response sheets were not usable; all remaining 346 participants were instructors at the AMEDDC&S. The sample included 51 officers and 248 noncommissioned officers (NCOs), as well as 34 civil service instructors (Department of the Army employees) and 13 other civilians hired under contract to provide instruction in basic medical skills. Officer and civilian instructors were thus represented at lower percentages in the sample than in the general AMEDDC&S instructor population. Conversely, NCO instructors comprised 71% of the sample of 346 subjects, a percentage much higher than the actual 38% reflected in the AMEDDC&S population of 730 instructors, suggesting that the results were not entirely representative of the instructor population as a whole.
Age was a demographic of great interest because it represented a developmental or natural maturation process. Therefore, it was examined in terms of possible influence on critical thinking skills using a one-way analysis of variance (ANOVA) procedure. While the majority of military instructors at the AMEDDC&S are generally assigned in midcareer, the ages of the instructors in the sample ranged from 26 to over 56 years. The sample was disaggregated into 7 subsets, shown in Table 1. There were no significant differences indicated for the overall CCTST score or the analysis, inference, inductive thinking, and deductive thinking subscores. However, the significance level for the evaluation subscore was .035, below the alpha level of .05. A subsequent Scheffe post hoc analysis revealed that the statistically significant difference occurred between the 46-50 years subset, with the lowest mean score of 3.41, and the 41-45 years subset, with the highest mean score of 5.03. There were no significant differences among the other subsets.
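The one-way ANOVA used throughout this analysis reduces to comparing between-group and within-group variability via the F statistic. The following sketch is illustrative only: the scores and group labels are invented, not the study's Table 1 data.

```python
# Illustrative sketch: the F statistic for a one-way ANOVA comparing
# subscores across several groups. All data below are hypothetical.

def one_way_anova_f(groups):
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # between-groups sum of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # within-groups sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    ms_between = ss_between / (k - 1)        # df_between = k - 1
    ms_within = ss_within / (n - k)          # df_within = n - k
    return ms_between / ms_within

# hypothetical evaluation subscores for three age subsets
groups = [
    [5, 4, 6, 5],   # e.g., one age subset
    [3, 4, 3, 4],   # e.g., another age subset
    [4, 5, 4, 5],   # e.g., a third age subset
]
f_stat = one_way_anova_f(groups)  # → 5.25
```

The resulting F statistic would then be evaluated against an F distribution with (k - 1, n - k) degrees of freedom at the alpha = .05 level; when it is significant, a post hoc procedure such as Scheffe's identifies which pairs of groups differ.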
A series of statements on the demographics survey addressed participants' attitudes and opinions regarding critical thinking skills and established the focus for the study. Table 2 presents the frequencies of responses to indicate participants' levels of agreement, ambivalence, and/or disagreement with the statements.
The sample population was segregated into 3 naturally occurring groups: officer, NCO, and civilian instructors. Data analysis was accomplished using a chi-square ([chi square]) procedure to test the association among the 3 subsets. There were statistically significant differences in the responses of the 3 groups for items 3 and 6; however, the source of the differences could not be isolated.
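A chi-square test of association compares the observed counts in each cell of a group-by-response table against the counts expected if group and response were independent. The sketch below is illustrative only; the counts are hypothetical, not the study's survey frequencies.

```python
# Illustrative sketch: chi-square statistic for a contingency table of
# instructor group by level of agreement. All counts are hypothetical.

def chi_square(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# rows: officer, NCO, civilian; columns: agree, neutral, disagree
table = [
    [30, 10, 11],
    [150, 60, 38],
    [20, 15, 12],
]
chi2 = chi_square(table)
# degrees of freedom = (rows - 1) * (columns - 1) = 4
```

The statistic is then compared to a chi-square distribution with (rows - 1)(columns - 1) degrees of freedom; a significant result indicates association but, as noted above, does not by itself identify which cells drive the difference.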
The preliminary research question for this study asked, "what are the critical thinking skills of instructors at the AMEDDC&S?" The results of the 6 ancillary hypotheses derived from this question are presented in the following sections. Where comparisons with a norm group were indicated, results were analyzed using independent samples t tests to determine significant differences between 2 sample means. The t tests were also used to analyze data generated for Hypothesis 3. Otherwise, an ANOVA procedure was used to analyze results of hypotheses involving multiple groups and CCTST scores, reported by the publisher as an overall score plus 5 subscores reflecting the critical thinking skills of analysis, inference, evaluation, inductive thinking, and deductive thinking.
It is important to note that a minimum group size of n=10 was applied throughout this data analysis. According to Isaac and Michael:
Statistically significant results were demonstrated in 4 of the 6 hypotheses: H1, H2, H4, and H6.
Hypothesis 1: Instructors of military students have the necessary skills to teach critical thinking skills.
It may be generally accepted that instructors, by virtue of the fact that they are instructors, have subject knowledge, skills, and abilities that exceed those of their students. As such, it is often assumed that their critical thinking skills exceed those of their students as well. A means of verifying this assumption would be to assess the critical thinking skills of both groups and compare them. However, there had been no prior CCTST testing of instructors or students at the AMEDDC&S. Therefore, Hypothesis 1 was addressed in 3 ways: (a) self-reporting by responding to questions on the demographic survey, (b) completing the CCTST and achieving a score, and (c) comparing overall CCTST scores and subscores to normed scores.
The previously noted [chi square] test indicated no significant differences among responses to the demographics survey statement "I have the skills sufficient to teach critical thinking." Interestingly, however, the actual mean overall CCTST scores and subscores of those who disagreed or strongly disagreed were slightly higher than those of participants in the other subsets. Those who disagreed may have underestimated their own critical thinking abilities, or simply allowed themselves some cushion in anticipation of their actual test results. Conversely, they may have had a better understanding of the concept of critical thinking, which enabled them to think more globally, to see the larger picture, and to realize what might actually be involved in teaching critical thinking.
What of those participants whose self-report indicated agreement or strong agreement that they had essential critical thinking skills, only to produce lower actual CCTST scores? Because critical thinking is not necessarily well understood, participants may have had a limited view of the skills involved, overestimated their abilities, and believed themselves to be good performers. This interpretation finds support in the aforementioned study conducted for the California Commission on Teacher Credentialing, which explored teachers' perceptions and practices related to critical thinking. Paul et al concluded, "it appears likely that we are now training teachers who not only have little understanding of critical thinking nor how to teach for it but also wrongly and confidently think they do." (2(p32))
The self-reported responses to the demographics survey were but one of the ways the critical thinking skills of AMEDDC&S instructors were addressed. The actual CCTST scores and subscores achieved provided a primary measure and were analyzed using norm-referenced interpretation. The sample was disaggregated by education level for comparison with normed scores. Table 3 presents the distribution based on the highest level of education.
There was no single norm group against which to compare the CCTST scores of the participants in the study of AMEDDC&S instructors. Instead, Insight Assessment, the publisher and vendor of the California Critical Thinking Skills Test, provided 2 separate sets of norm scores based on highest level of education: one was a 2-year college group based on scores of 729 students, and the other was a 4-year college group developed from scores of 2677 students. There were no norm scores available for graduate levels or advanced degrees at either the masters or doctoral levels.
Because the education levels of the subjects in this study ranged from high school to post-doctorate, the data were recoded into 4 subsets for analysis. Level one included scores originally designated high school, some college, and associates degree, and it was coded Some College. Level two, coded Baccalaureate, was unchanged. Level three, coded Masters, included both masters and post-masters, and level four, coded Doctorate, included both doctorate and post-doctorate. This allowed for the CCTST mean scores of those in this study with education levels equal to or less than an associates degree to be compared to the mean scores of the 2-year norm group. The CCTST mean scores of those holding a baccalaureate degree were compared to the scores of the 4-year norm group.
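The recoding described above amounts to collapsing the reported education levels into 4 analysis subsets. A minimal sketch (the subset labels follow the text; the mapping code itself is purely illustrative):

```python
# Illustrative sketch: collapsing reported education levels into the four
# analysis subsets described in the text. The mapping reflects the recoding
# scheme stated above; the code is hypothetical, not the study's SPSS syntax.

RECODE = {
    "high school": "Some College",
    "some college": "Some College",
    "associates degree": "Some College",
    "baccalaureate": "Baccalaureate",
    "masters": "Masters",
    "post-masters": "Masters",
    "doctorate": "Doctorate",
    "post-doctorate": "Doctorate",
}

def recode(level):
    """Map a reported education level to its analysis subset."""
    return RECODE[level.lower()]

levels = ["Associates Degree", "Post-Masters", "High School"]
coded = [recode(x) for x in levels]
# → ["Some College", "Masters", "Some College"]
```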
Comparison with 2-year norm group. Two hundred ten (61%) of the participants in the AMEDDC&S instructor sample reported an education level equal to or less than 2 years of college. Mean scores of the 2 groups were very similar, and t tests revealed no significant differences between the mean scores of the sample population and the mean scores of the norm group (Table 4). Therefore, one may infer that the critical thinking skills of the AMEDDC&S instructors with education equal to or less than a 2-year associates degree did not differ measurably from those of the 2-year college students.
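Norm-group comparisons of this kind rest on an independent-samples t statistic computed from summary statistics (means, standard deviations, and group sizes). The sketch below uses the unequal-variances (Welch) form with invented values; the actual Table 4 statistics and the exact test variant used in the study are not reproduced here.

```python
import math

# Illustrative sketch: independent-samples t statistic from summary
# statistics, unequal-variances (Welch) form. All values are hypothetical.

def welch_t(m1, s1, n1, m2, s2, n2):
    # standard error of the difference between the two means
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    t = (m1 - m2) / se
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = (s1**2 / n1 + s2**2 / n2) ** 2 / (
        (s1**2 / n1) ** 2 / (n1 - 1) + (s2**2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# hypothetical: a sample mean CCTST score vs a 2-year-college norm group
t, df = welch_t(m1=15.9, s1=4.5, n1=210, m2=15.9, s2=4.4, n2=729)
# identical means → t = 0.0, ie, no evidence of a difference
```

When the computed t falls short of the critical value for the approximated degrees of freedom at alpha = .05, the null hypothesis of equal means is not rejected, which is the inference drawn for the 2-year comparison above.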
Comparison with 4-year norm group. Sixty-nine (20%) of the AMEDDC&S instructor sample reported their highest level of education as the baccalaureate degree. Their scores were compared with norm scores of 2,677 4-year college students (Table 5). Again, because there were no instructor norms available, the 4-year college group was considered to be representative of a peer group. In this case, the t tests revealed statistically significant differences in critical thinking skill levels in the areas of inference and deductive thinking. Further, in all areas, the 4-year college students had higher mean scores than the mean scores of the AMEDDC&S instructor sample in the Baccalaureate group.
One might reasonably ask why college students demonstrated significantly higher levels of certain critical thinking skills than military instructors with equal numbers of educational credits plus several years of military experience. The answer may lie in the instructors' own military training backgrounds. It may be that in a structured military training environment, a focus on discrete tasks and technical content framed within specialty areas actually becomes a limiting factor with regard to critical thinking.
Both Kegan (9) and Brookfield (10) asserted that transformative learning through critical thinking/reflection is not necessarily a goal for all instruction, and that all reflection is not necessarily critical or transformative. Kegan acknowledged differences between transformational and informational learning. Because it deepens or adds to understanding, informational learning may be appropriate for the theory, concepts, principles, and procedures associated with the specificity of military tasks and training.
Finally, with regard to the norm-referenced interpretation of the mean CCTST scores of the Some College and Baccalaureate subsets, it may be important to note that all mean scores of the Baccalaureate subset were higher than the mean scores of the Some College subset (Table 6). This might suggest that there was evidence of some increased cognitive development for those 2 additional years of higher education. This finding is similar to that of Pascarella (11) who reported modest gains in critical thinking abilities for students attending college over students who did not.
Although there were no norm group scores available for those participants in the study who hold advanced degrees, the mean overall CCTST scores and subscores of AMEDDC&S instructors were compared by education levels among and within the 4 subsets, with some surprising results (Table 6). While one might intuitively expect that increasingly higher levels of educational experience would yield higher levels of critical thinking skills, the results of a one-way ANOVA related to the master's degree indicated only one area of significant difference: the mean overall CCTST scores of those with master's degrees were significantly higher than the mean overall scores of those in the 2-year subset. However, comparing the means of all the CCTST subscores, there were no significant differences among those instructors reporting a master's degree as their highest level of education, those with some college, or those with a baccalaureate degree. Perhaps not so surprisingly, those with doctorates as their highest level of education had statistically significantly higher overall CCTST scores and all subscores than any of the other participants across all 4 education level subsets.
Given the challenges and academic rigor associated with advanced doctoral level programs, these results are not unexpected. There is an expectation that those who have earned advanced degrees in a discipline have learned the skills of critical thinking in the process. (12) Onwuegbuzie (13) compared masters- and doctoral-level students using the CCTST, which revealed significantly higher critical thinking scores for doctoral students. Nonetheless, doctoral level results aside, the comparison of the overall CCTST scores and subscores of those in the Masters subset with those in the Some College and Baccalaureate subsets does raise some questions regarding the nature of their graduate degrees.
Through the Defense Activity for Nontraditional Education Support and the Servicemembers Opportunity Colleges Army Degree (SOCAD) program, military personnel may be given college credit for their military training and/or experience. The Military Evaluation Program, under the auspices of the American Council on Education (ACE), also evaluates military occupational training and publishes credit recommendations in the Guide to the Evaluation of Educational Experiences in the Armed Services (ACE Guide, available at http://militaryguides.acenet.edu/), which serves as a reference for over 3,000 colleges and universities in addition to the SOCAD network. (14)
There were no data collected to indicate the source of the academic degrees held by the AMEDDC&S instructors, and, while the comparisons of the study sample with the norm groups were considered appropriate, it was not assumed that credit awarded for military or life experience represented the same academic experience as that of matriculated students at a college or university. In the former cases there is, in effect, the equivalent of a "training avoidance," not unlike the cost avoidance that occurs when the requirement for an expenditure is eliminated. When individuals are given credit for training they have already had, they do not accrue any benefits that would be derived from additional learning or development. There are also many distance-learning opportunities afforded to the military through programs such as eArmyU. **** Because courses are offered by a number of institutions, there may not be a consistent philosophy or program of developmental progression of the kind one might expect to find in a traditional 4-year program. Rather, degree plans and programs may be cobbled together out of disparate experiences and training that satisfy administrative requirements but may not serve to enhance critical thinking skills.
Ironically, civilian educational institutions have also been confronted with issues similar to those noted above:
Weingartner observed that ends and means are often confused, and courses, rather than the learning, are considered to be the goal:
With regard to the value of the baccalaureate degree, the Council for Higher Education Accreditation (among others) reaffirms the importance of a liberal education in assuring that students develop the capacity to think and obtain the knowledge necessary to prepare for the future. Indeed, an impetus for this study emerged as a result of the increasing emphasis on critical thinking skills as a component of military education.
Hypothesis 2: There are differences in the level of critical thinking skills of officer and noncommissioned officer instructors.
Because the sample population was naturally segregated into 3 distinct groups of participants, Hypothesis 2 was tested by investigating the critical thinking skill levels not only of instructors who were officers and those who were NCOs, but also of those who were civilians, using an ANOVA procedure (Table 7). The significance levels for the mean overall CCTST score and all subscores were less than .001. The Scheffe post hoc analysis indicated that the officer instructor group was significantly different from the others, with higher critical thinking scores than both the NCO and civilian instructors. There was no significant difference between the scores of the civilian and NCO instructors.
In light of the literature addressing the dominance of the professional officer subculture, perhaps these results are not surprising. (17) Further, the Center for Strategic and International Studies Study Group on Professional Military Education, which focused exclusively on officer education, emphasized increasing intellectual demands. (18) Their recommendations for increased academic education and advanced degrees, particularly in civilian institutions, are supported by the results of this study. Based on the previous discussion of education levels, the number of advanced degrees held by the officer subset of AMEDDC&S instructors must certainly be regarded as one explanation for the differences. Officer instructors held 43% of the master's degrees and 92% of the doctorates represented in this study. Conversely, the NCO (87%) and civilian (12%) instructors represented nearly the entirety of those instructors who populated the Some College subset.
The majority of the instructors at the AMEDDC&S are NCOs. Because the results of this study indicated significant differences between the mean CCTST scores and subscores of officers and NCOs, the training and education of NCOs should be considered. It may be that a focus on training individual tasks to established standards either limits or precludes the critical reflection necessary for integrated transformative learning. In many respects, NCO training has been conducted from a fairly linear approach, characterized by performance checklists and algorithms, and is usually taught in a step-by-step format described as "crawl, walk, run." These training processes may be entirely appropriate, given the ever-increasing demands for initial training in both warrior skills and medical skills for the contemporary operational environment. ***** It may be desirable that these medical Soldiers first develop their skills to the level of automaticity. Subsequently, however, it is possible that with increasing implementation of scenario-based training, more explicit training for critical thinking will be realized.
Hypothesis 3: There are differences in the level of critical thinking skills of instructors teaching leadership courses and those who teach a technical medical specialty.
The comparison of mean overall CCTST scores and subscores of those who taught leadership courses and those of medical specialty instructors yielded no significant results (Table 8). It should be noted that mean scores of officer, NCO, and civilian instructors were all included in this comparison. Because officers were often responsible for curriculum development, served as course directors, and, in fact, taught some of the military occupational specialty (MOS) courses, their scores were included in the data analysis. The scores of civilian instructors were also included because they also have teaching responsibilities along with those NCO instructors who hold the MOS.
Of note, the leadership subset was populated by only 13 subjects. A future study with greater participation of leadership instructors from both the Officer Education System and the Noncommissioned Officer Education System would provide more definitive results.
Hypothesis 4: There are differences in levels of critical thinking skills among instructors teaching in the various medical military occupational specialties.
Statistically significant differences were found in only 2 instances when the mean overall CCTST scores and subscores of instructors in 9 medical military occupational specialties were compared (Table 9). In one case, the overall CCTST scores of radiology specialists were higher than those of the practical nurses, while the mean scores of the other MOSs showed no differences. In the second case, the mean CCTST induction subscores of the veterinary food inspection specialists were higher than those of the practical nurses, while the mean scores of the remaining MOSs did not differ statistically.
One possible explanation for the differences may be that 25% of the radiology specialists had education levels equal to or higher than a baccalaureate degree, whereas the corresponding figures for the veterinary food inspectors and practical nurses were 10% and 6%, respectively. Preliminary qualifications for military occupational specialties are determined by scores on the Armed Services Vocational Aptitude Battery (ASVAB), the official mental testing battery used by all US armed services. Groups of ASVAB subtest scores are combined into composite line scores, the most important of which for the majority of the medical enlisted specialties is the Skilled Technical (ST) score, a composite of subtests of general science, word knowledge, paragraph comprehension, mechanical comprehension, and mathematics knowledge. The ST requirements for the radiology specialist (MOS 91P), the veterinary food inspection specialist (MOS 91R), and the practical nurse (MOS 91WM6) are 110, 100, and 101, respectively. If the skilled technical score were indicative of actual critical thinking skills, the results of this comparison might have been anticipated, although a review of the literature indicated that critical thinking is not the same as intelligence, in that critical thinking involves specific skills and dispositions. (1,20-22)
Therefore, although the ST scores may be an indication of the mental abilities or cognitive skills required for each MOS, a more complete explanation for differences in critical thinking skill levels may be realized only after further study examining the differences among the course content and training processes of all 3 groups. Both the radiology specialists and the veterinary food inspection specialists reported having a great deal of ownership of and responsibility for the curriculum and development of their courses. Their branches are fairly small, and the content of their courses is challenging and well integrated.
On the other hand, it should be noted that the practical nurses participating in this study were among nearly 300 instructors responsible for the training of healthcare specialists (MOS 91W) in the Department of Combat Medic Training. Instruction is modularized, and instructors do not necessarily have responsibility for curriculum development. This situation may be like those which gave rise to observations regarding undeveloped intellectual capital within an organization when individuals have responsibility neither for training development nor for decisions related to the intellectual development of their students. (23,24) There may not be opportunity for intellectual development or challenge for instructors as well. It may also be that the differences in numbers of instructors in small versus large departments impact the culture of their teaching organizations and further influence perceptions of individual intellectual investment. Theirs may necessarily be a function of "instructor as messenger" rather than that of instructor/training developer. (25)
The AMEDDC&S is organized into teaching departments based on specialty areas which include the related MOSs. Therefore, another measure of instructor skills in specialty areas involved a comparison of overall mean CCTST scores and subscores across all departments, including officer, civilian, and NCO instructors. Only departments having 10 or more participants were included in this comparison. Statistically significant differences were found in all instances when the mean overall CCTST scores and subscores of instructors in 9 teaching departments were compared using a one-way ANOVA (Table 10). The instructors of the Department of Veterinary Sciences (DVS) had higher overall CCTST scores and inference subscores than did the instructors in both the Department of Health Services Administration (DHSA) and the Department of Combat Medic Training (DCMT). In addition, the DVS instructors had higher analysis and deductive thinking subscores than did the DCMT instructors, as well as higher evaluation and inductive thinking subscores than did the instructors in the DHSA.
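The one-way ANOVA behind such comparisons reduces to a ratio of between-group to within-group variance. The sketch below computes the F statistic by hand for three fabricated mini-samples whose means merely echo the reported pattern (DVS above DHSA and DCMT); these are not the study's data, and the study itself would have used standard statistical software on far larger groups:

```python
def one_way_anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of score lists."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (df = k - 1)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = n - k)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Fabricated overall-CCTST-style scores for three hypothetical departments.
dvs = [22, 18, 25, 19, 21]
dhsa = [14, 12, 15, 11, 13]
dcmt = [13, 15, 12, 14, 11]
f = one_way_anova_f([dvs, dhsa, dcmt])
print(round(f, 2))  # → 25.6
```

A large F (relative to the F distribution with k−1 and n−k degrees of freedom) indicates that at least one group mean differs; pairwise contrasts would then locate which departments differ, as the study reports for DVS versus DHSA and DCMT.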
Based on the earlier discussions of Hypotheses 1 and 2, which found significant differences between the overall CCTST scores and subscores of officer and enlisted instructors, as well as significant differences associated with advanced degrees at the doctoral level, the differences in critical thinking skill levels among departments might simply be explained by the fact that 36% of the DVS participants were officers, compared with 10% and 5.4% officer representation for DHSA and DCMT respectively. Additionally, education at or above the doctoral level was held by 24% of the subjects representing the DVS, by 5% of the subjects from the DHSA, and by 0% from the DCMT. Therefore, the differences in overall CCTST scores and subscores among instructors in the AMEDDC&S teaching departments could be attributed to their rank and their highest levels of education.
On the other hand, those who, like McPeck (26) and Gardner, (20) espouse a domain-specific paradigm of critical thinking, would argue that an analysis of professional knowledge, skill sets, and competencies required for the specialties within each department might better inform an interpretation of the results.
Military Experience. The last 2 hypotheses in this study were addressed by 3 different examinations involving participants' military assignment background, military deployment experience, and years of service. These examinations provided surprisingly similar results. No statistically significant differences were found in comparisons of mean CCTST scores and subscores of instructors by ranges of years of total military service; nor were there differences when types of assignments were compared. Types of military assignment backgrounds were categorized as: (a) primarily Table of Distribution and Allowances (TDA)--to organizations characterized as primarily "institutional," because they are defined by TDAs; (b) primarily Table of Organization and Equipment (TOE)--to organizations characterized as "operational," because they are defined by TOEs; and (c) "both"--background which included TOE as well as TDA assignments. These data are presented in Tables 11 and 12. Finally, in a comparison of types of deployment experience, there were no significant differences found except in the means of CCTST analysis subscores of officers (Tables 13 and 14).
Hypothesis 5: There are differences in the level of critical thinking skills of instructors with a TOE background, those who have primarily a TDA background, and those who have experience in both types of assignments.
The results of the ANOVA indicated that no significant differences were found. However, it was interesting to note that in both the officer and NCO subsets, actual mean overall CCTST scores and subscores were highest for those instructors having had only TDA assignments. It is possible that the nature of unit training requirements and exercises in an operational TOE environment may somehow limit an individual's range of decision-making or problem-solving requirements. This may be especially true when compared with the AMEDD institutional TDA environment in terms of the broad scope of research and medical treatment practiced in Army Medical Centers and teaching hospitals. The potential for more numerous opportunities for further training and continuing education afforded in a TDA assignment should also not be discounted.
Hypothesis 6: There are differences in the level of critical thinking skills of instructors who have combat experience, humanitarian deployment experience, those who have both, and those who have had neither type of experience.
When comparisons of critical thinking skill levels were based on instructors' deployment experience, the only statistically significant difference indicated that the mean CCTST analysis subscores were highest for those officers who had neither combat nor humanitarian deployment experience. In all other instances, the overall CCTST scores and subscores for both officer and NCO instructors, when compared within their respective subsets, were statistically the same regardless of their deployment experience. The actual mean overall CCTST scores and subscores for both officers and NCOs followed similar patterns, with the instructors who have had neither combat nor humanitarian deployment experience having the highest scores in a majority of cases. These results were not inconsistent with the comparisons of TOE and TDA background experience since combat and humanitarian or peacekeeping deployments may be most closely associated with TOE assignments, ie, the operational Army. Likewise, it was not unreasonable to associate those instructors having had no deployment experience with TDA assignments, ie, the institutional Army.
With regard to the issue of combat experience, it should be noted that those in this study of AMEDDC&S instructors who indicated having combat experience were for the most part referring to Desert Shield/Desert Storm of 1991. Since then, and particularly beginning in 2000 with the Army Training and Leader Development Panels, there has been significant study of the Army profession. There has also been considerable study of how the nature of combat experience in Afghanistan and Iraq differs from that of previous conflicts, and of the development of expert knowledge. Snider pointed to the number of rotations of soldiers to the Middle East with the observation that "... students at Leavenworth now know more than their faculty." (27) In his study of the combat experience of junior officers in Afghanistan and Iraq, Wong describes a cohort of "innovative, confident, and adaptable" lieutenants and captains who have learned to "... actually lead and make decisions rather than merely to execute the orders of higher commands." (28(p13)) It is important to consider whether those lessons learned reflected enhanced critical thinking abilities. It may be that the critical thinking skills of AMEDDC&S instructors more recently returned from combat deployment would differ from those who participated in this study. Only further research could determine if such is the case.
While it was not possible to compare the results of this study to other specific research on military medical instructors' critical thinking skills, the effects of education on the development of critical thinking abilities could be corroborated. In a 1997 study conducted at the Center for Creative Leadership in Greensboro, NC, Duchesne examined "how developmental learning and adaptive flexibility relate to the level of critical thinking in [a] sample of organizational leaders." (29(p1)) Duchesne identified the following variables that had been associated with critical thinking in previous research: intelligence, age, years of education, and years of experience. To this list he added adaptive flexibility, and hypothesized that no combination of factors would predict the level of critical thinking. An analysis of the data generated by the Watson-Glaser Critical Thinking Appraisal (Pearson Assessment Inc, San Antonio, Texas), the Background Section of the McCauley Job Challenge Profile, (30) and Kolb's Adaptive Style Inventory (Experience Based Learning Systems, Cleveland, Ohio) indicated "years of education beyond high school" to be the only significant predictor of critical thinking score. (29(p3)) From another perspective, Pascarella concluded that apart from any specific experience, it was the total integration of intellectual or cognitive development [in college] that positively influenced the development of critical thinking ability. (11(p7))
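Duchesne's style of analysis, regressing a critical thinking score on candidate predictors to see which carries weight, can be sketched as follows. The data here are fabricated so that only years of education truly drives the outcome; the variable names, coefficients, and sample size are illustrative assumptions, not Duchesne's values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
education_yrs = rng.uniform(0, 8, n)     # years of education beyond high school
experience_yrs = rng.uniform(0, 30, n)   # years of work experience
# Fabricated outcome: only education contributes (slope 1.2), plus noise.
ct_score = 12 + 1.2 * education_yrs + rng.normal(0, 1.5, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), education_yrs, experience_yrs])
beta, *_ = np.linalg.lstsq(X, ct_score, rcond=None)
print(f"education coefficient ~ {beta[1]:.2f}, experience coefficient ~ {beta[2]:.3f}")
```

The fitted education coefficient recovers roughly 1.2 while the experience coefficient hovers near zero, mirroring the finding that "years of education beyond high school" was the only significant predictor.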
As noted in the literature, critical thinking is not innate; it must be learned. It must be learned in a way that is intentional and conative, (31) and it must be supported by a culture that values critical thinking. There are no established standards of sufficiency with regard to the levels of critical thinking skills required of instructors, other than an understanding that instructor skills should be higher than those of their students. Certainly, the CCTST could be administered to sample populations of various groups of AMEDDC&S students to facilitate comparisons of critical thinking skill levels with those of instructors and, where appropriate, the norm groups as well. If it is accepted that instructor scores should be higher than student scores, then efforts toward increasing instructor critical thinking skill levels should logically follow.
It is important that critical thinking be understood to involve the quality of one's thinking as opposed to being more narrowly defined as decision-making, problem-solving, creative thinking, intuition, brainstorming, or "thinking outside the box." Major General Maggart made a particularly cogent recommendation at the opening of a 2001 Army Research Institute Critical Thinking Workshop conducted at Ft Leavenworth:
The purpose of this study was to measure critical thinking skills of instructors within the AMEDDC&S, in part, to establish a baseline or frame of reference for subsequent professional development and curriculum design and development, as well as other critical thinking initiatives. Many questions emerged: What will be the trends in critical thinking skill levels of instructors over the next 5 or 10 years? To what extent is the organizational culture supportive of critical thinking? What kinds of experiences impact instructor critical thinking skills? What kind of training or professional development will be required? Do enhanced critical thinking skills of instructors actually lead to greater student success? What are the characteristics of instructors of successful students? What kinds of successful student achievement data could be indicative of instructor critical thinking skills? There may not be easy answers, but it was hoped that the results of this study might provide a starting point for increased interest in collaborative efforts toward development of critical thinking skills as AMEDDC&S intellectual capital, professional development of instructors, and, ultimately, improved student performance.
(1.) Facione PA. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Millbrae, CA: California Academic Press; 1990: ERIC Doc No ED 315 423. Available at: http://www.eric.ed.gov/PDFS/ED315423.pdf. Accessed September 21, 2010.
(2.) Paul R, Elder L, Bartell T. California Teacher Preparation for Instruction in Critical Thinking: Research Findings and Policy Recommendations. Sacramento, California: California Commission on Teacher Credentialing; 1997. Available at: http://www.criticalthinking.org/store-page.cfm?P=products&ItemID=147&catalogID=214&cateID=132. Accessed September 21, 2010.
(3.) O'Neal EH. Recreating Health Professional Practice for a New Century: The Fourth Report of the Pew Health Professions Commission. San Francisco, CA: The Center for the Health Professions; 1998.
(4.) Army Training and Leader Development Panel Officer Study: Report to the Army. Fort Leavenworth, Kansas: US Army Combined Arms Center; 2001. Available at: http://www.army.mil/features/ATLD/report.pdf. Accessed September 21, 2010.
(5.) Army Training and Leader Development Panel Noncommissioned Officer Study: Report to the Army. Fort Leavenworth, Kansas: US Army Combined Arms Center; 2002. Available at: http://www.army.mil/features/atldpnco/nco_study_report.pdf. Accessed September 21, 2010.
(6.) Duldt BW. Coaching winners: how to teach critical thinking. In: Duldt BW. Instructor's Manual: SmartPrim, A Computer Assisted Instructional Program about Critical Thinking, Term Papers, and Speeches. Springfield, VA: Duldt & Associates Inc; 1999:39.
(7.) Facione PA, Facione NC, Blohm SW, Giancarlo CAF. The California Critical Thinking Skills Test Manual. Millbrae, CA: California Academic Press; 2002.
(8.) Isaac S, Michael WB. Handbook in Research and Evaluation for Education and the Behavioral Sciences. 3rd ed. San Diego, California: Educational and Industrial Testing Services; 1997:101.
(9.) Kegan R. What form transforms? A constructive-developmental approach to transformative learning. In: Mezirow J, ed. Learning as Transformation. Critical Perspectives on a Theory in Progress. San Francisco, CA: Jossey-Bass; 2000:35-70.
(10.) Brookfield S. Transformative learning as ideology critique. In: Mezirow J, ed. Learning as Transformation. Critical Perspectives on a Theory in Progress. San Francisco, CA: Jossey-Bass; 2000:125-150.
(11.) Pascarella ET. The development of critical thinking: does college make a difference? J Coll Student Dev. 1989;30:19-26.
(12.) Schafersman SD. An introduction to critical thinking. Scottsdale Community College Web site. January 1991. Available at: http://smartcollegeplanning.org/wp-content/uploads/2010/03/Critical-Thinking. Accessed September 21, 2010.
(13.) Onwuegbuzie AJ. Critical thinking skills: a comparison of doctoral and master's level students. Coll Stud J. September 2001. Available at: http://findarticles.com/p/articles/mi_m0FCR/is_3_35/ai_80744661/?tag=content;col1. Accessed September 21, 2010.
(14.) Kime SF, Anderson CL. Contributions of the military to adult and continuing education. In: Wilson AL, Hayes ER, eds. Handbook of Adult and Continuing Education. New York: John Wiley and Sons; 2000:471-472.
(15.) Greater Expectations: A New Vision for Learning as a Nation Goes to College. Washington, DC: Association of American Colleges and Universities; 2002:x. Available at: http://greaterexpectations.org/pdf/GEX.FINAL.pdf. Accessed September 21, 2010.
(16.) Weingartner RH. Undergraduate Education: Goals and Means. Phoenix, Arizona: The Oryx Press; 1992:3.
(17.) Snider DM. An uninformed debate on military culture. In: Lehman JF, Sicherman H, eds. America the Vulnerable: Our Military Problems and How to Fix Them. Philadelphia, PA: Foreign Policy Research Institute; 2000:115-130. Available at: http://www.fpri.org/americavulnerable/BookAmericatheVulnerable.pdf. Accessed September 21, 2010.
(18.) Cheney RB, Taylor B, eds. Professional Military Education: An Asset for Peace and Progress: A Report of the CSIS Study Group on Professional Military Education. Washington, DC: Center for Strategic & International Studies; 1997.
(19.) Joint Publication 1-02: DoD Dictionary of Military and Associated Terms. Washington, DC: Joint Staff, US Dept of Defense; March 4, 2008. Available at: http://www.dtic.mil/doctrine/jel/new_pubs/jp1_02.pdf.
(20.) Gardner H. Multiple Intelligences. New York: BasicBooks; 1993.
(21.) Goleman D. Vital Lies, Simple Truths: The Psychology of Self Deception. New York: Simon & Schuster; 1985.
(22.) Paul R, Elder L. Critical Thinking: Tools for Taking Charge of Your Learning and Your Life. Upper Saddle River, NJ: Prentice Hall; 2001.
(23.) Robinson L. America's secret armies: a swarm of private contractors bedevils the U.S. military. US News & World Report. October 27, 2002. Available at: http://www.usnews.com/usnews/culture/articles/021104/archive_023164.htm. Accessed September 21, 2010.
(24.) Snider DM, Lira LL. The Future of the Army Profession Phase II. Paper presented at: Senior Leader Conference XLI; September/October 2004; West Point, NY.
(25.) Slusarski SB. Learners' perspectives of the train-the-trainer program in creating the role of classroom trainer. Proceedings of the Annual Adult Education and Research Conference; 1999. Available at: http://www.edst.educ.ubc.ca/aerc/1999/99slusarski.htm. Accessed September 21, 2010.
(26.) McPeck JE. Critical Thinking and Education. New York: St Martin's Press; 1981.
(27.) Snider DM. The multiple identities of the professional Army officer. In: Snider DM, Matthews LJ, eds. The Future of the Army Profession. New York: McGraw-Hill; 2005:142.
(28.) Wong L. Developing Adaptive Leaders: The Crucible Experience of Operation Iraqi Freedom. Carlisle Barracks, PA: Strategic Studies Institute, US Army War College; 2004. Available at: http://www.strategicstudiesinstitute.army.mil/pubs/display.cfm?PubID=411. Accessed September 21, 2010.
(29.) Duchesne R. Critical thinking, developmental learning, and adaptive flexibility in organizational leaders. Proceedings of the Annual Adult Education and Research Conference; May 16-18, 1997. Available at: http://www.adulterc.org/Proceedings/1997/97duchesne.htm. Accessed September 21, 2010.
(30.) McCauley CD, Ohlott PJ, Ruderman MN. Job Challenge Profile: Learning from Work Experience. San Francisco, CA: Pfeiffer; 2010.
(31.) Huitt W. Conation as an important factor of mind. Valdosta, GA: Valdosta State University; 1999. Available at: http://www.edpsycinteractive.org/topics/regsys/conation.html. Accessed September 21, 2010.
(32.) Riedel S. Training critical thinking skills for battle command. Army Research Institute Newsletter. Fort Leavenworth, Kansas. Spring 2001:7. Available at: http://www.au.af.mil/au/awc/awcgate/army/ari_spr01_crit.pdf. Accessed September 21, 2010.
Carol F. Hobaugh, PhD
* Dr Hobaugh's dissertation, the basis for this article, was completed in October 2005.
** The Kuder-Richardson Formula 20 (KR20), first published in 1937, is a measure of internal consistency reliability for an overall test consisting of several individual items, each measuring some construct. Source: Nova Southeastern University Center for Psychological Studies (http://www.cps.nova.edu/~cpphelp/class/psy0507/consistency.html).
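The KR20 computation described in this footnote is short: for k dichotomous (0/1) items, KR20 = k/(k−1) × (1 − Σp·q / total-score variance), where p is the proportion answering an item correctly and q = 1 − p. The sketch below is a minimal illustration on toy data, using the population-variance convention for the total-score variance (conventions vary; this is not the CCTST's published scoring procedure):

```python
def kr20(responses):
    """KR20 internal consistency for a list of per-examinee 0/1 item-score lists."""
    k = len(responses[0])   # number of items
    n = len(responses)      # number of examinees
    # Per-item proportion correct, and the sum of p*q across items.
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # Variance of examinees' total scores (population convention).
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_t)

# Toy example: four examinees, three items.
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(round(kr20(scores), 3))  # → 0.75
```

Higher values (closer to 1) indicate that the items hang together as a measure of a single construct, which is what the reliability claim for an overall test rests on.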
*** Prescribes the organizational structure, personnel and equipment authorizations, and requirements of a military unit to perform a specific mission for which there is no appropriate table of organization and equipment (the document that defines the structure and equipment for a military organization or unit).
**** eArmyU is an Army online learning portal which provides access for Soldiers to over 100 degree plans at regionally-accredited colleges and universities. Information available at https://www.goarmyed.com/public/public_earmyu-about_earmyu.aspx.
***** The operational environment that exists today and for the clearly foreseeable future. An operational environment is defined in DoD Joint Publication 1-02 (19) as "a composite of the conditions, circumstances, and influences that affect the employment of military forces and bear on the decisions of the unit commander."
Dr Hobaugh is Deputy Chief, Staff and Faculty Development Division, Army Medical Department Center and School, Fort Sam Houston, Texas.
We understand critical thinking to be purposeful, self-regulatory judgment that results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. (1(p3))
We, the faculty, are to teach critical thinking throughout the curriculum to all health care professional students so that new practitioners will be able to function effectively and creatively in the changing arena of health care. Somehow or other, in a manner and by a method not stated or clearly understood, we, the faculty, are to do this. However, we, the faculty, are the very ones, to a significant degree, who are alumni of an educational system which historically has omitted the very things we are now to teach. (6)
... process of purposeful, self-regulatory judgment. This process gives reasoned consideration to evidence, contexts, conceptualizations, methods, and criteria." (7(p2))
... in exploratory research, samples of this size are large enough to test the null hypothesis...[and] remain educationally significant.
In the absence of shared learning goals and clear expectations, a college degree more frequently certifies completion of disconnected fragments than [that] of a coherent plan for student accomplishment. (15)
We require what we believe students need, what they cannot do without; fulfilling requirements, however, soon becomes the whole of education. (16)
... the Army is basically a process driven organization, more interested in standardization of thought than in divergent thought. Critical thinking requires divergent thought, so as you discuss critical thinking over the next 2 days, please try to resist the urge to develop another military decision making process or a command and staff action process ... If you describe critical thinking as an algorithm, that is what the Army will teach and students will learn ... but unfortunately they will not learn how to think critically! (32)
Table 1. Demographic data for the study participants.

                                         Officer   NCO   Civilian   Totals
Instructor status (missing = 0)               51   248         47      346
Gender (missing = 20)
  Male                                        34   176         27      237
  Female                                      15    59         15       89
Ethnicity (missing = 56)
  African American                             3    64          8       75
  Anglo American, Caucasian                   39    89         22      150
  Asian American                               0     7          0        7
  Hispanic, Mexican American, Latino           2    33          6       41
  Native American                              1     3          1        5
  Other                                        0    11          1       12
Age (missing = 3)
  26-30 years                                  2    11          2       15
  31-35 years                                  5    98          4      107
  36-40 years                                 12    87          1      100
  41-45 years                                 15    39          5       59
  46-50 years                                  7    10          9       26
  51-55 years                                  4     2         12       18
  Over 56                                      4     1         13       18

Table 2. Frequency data for participants' responses to statements on critical thinking (CT) skills; n (%).

Survey item                                Strongly agree   Agree        Neither agree   Disagree    Strongly    Missing
                                                                         nor disagree                disagree
CT skills are essential for the military   162 (46.8)       127 (36.7)    36 (10.4)       2 (0.6)    2 (0.6)     17
Necessary to teach my class CT skills       76 (22)         161 (46.5)    72 (20.8)      15 (4.3)    4 (1.2)     18
CT already included in my instruction       26 (7.5)        123 (35.5)   111 (32.1)      54 (15.6)   9 (2.6)     23
I have sufficient CT skills to teach CT     32 (9.2)        130 (37.6)   127 (36.7)      34 (9.8)    4 (1.2)     19
My CT skills developed from experience      76 (22)         179 (51.7)    61 (17.6)       3 (0.9)    0           27
My CT skills developed from education       13 (3.8)         91 (26.3)   155 (44.8)      54 (15.6)   3 (0.9)     30

Table 3. Distribution of sample based on education level.

Education level        Frequency   Percent
High School                    3       0.9
Some College                 116      33.5
Associates Degree             91      26.3
Baccalaureate Degree          69      19.9
Masters Degree                34       9.8
Post-Masters                   4       1.2
Doctorate                     22       6.4
Post-Doctorate                 5       1.4
Missing                        2       0.6
Total                        346     100.0

In Tables 4 through 14, cells show mean (standard deviation). CCTST indicates California Critical Thinking Skills Test; MOS, military occupational specialty; TDA, Table of Distribution and Allowances; TOE, Table of Organization and Equipment.

Table 4. Overall CCTST scores and subscores of the sample (n = 210) compared to the 2-year college norm group.

Group        n     Overall         Analysis      Inference     Evaluation      Induction     Deduction
Sample       210   14.71 (4.452)   4.03 (1.470)  6.77 (2.337)  14.71 (4.452)   8.84 (2.611)  5.87 (2.565)
Norm group   729   14.75 (4.922)   4.13 (1.491)  6.80 (2.632)  14.75 (4.922)   8.60 (2.787)  6.14 (2.837)

Table 5. Overall CCTST scores and subscores of baccalaureate-level subjects (n = 69) compared to the 4-year college norm group.

Group        n      Overall         Analysis      Inference     Evaluation    Induction     Deduction
Subjects     69     15.65 (5.387)   4.38 (1.446)  7.10 (2.739)  4.17 (2.242)  9.20 (2.978)  6.45 (3.183)
Norm group   2677   16.80 (5.062)   4.44 (1.408)  7.85 (2.684)  4.52 (2.143)  9.53 (2.821)  7.27 (2.889)

Table 6. Overall CCTST scores and subscores of instructors compared by education level (n = 346).

Group           n     Overall         Analysis      Inference      Evaluation    Induction      Deduction
Some College    210   14.71 (4.452)   4.03 (1.470)   6.77 (2.337)  3.91 (1.877)   8.84 (2.611)   5.87 (2.565)
Baccalaureate    69   15.65 (5.387)   4.38 (1.446)   7.10 (2.739)  4.17 (2.242)   9.20 (2.978)   6.45 (3.183)
Masters          38   17.55 (5.411)   4.61 (1.443)   8.16 (2.805)  4.79 (2.642)  10.24 (2.889)   7.32 (2.960)
Doctoral         27   23.22 (3.876)   5.59 (0.844)  10.41 (2.454)  7.22 (1.672)  12.67 (1.732)  10.56 (2.819)
Missing           2
Totals          346   15.88 (5.247)   4.28 (1.483)   7.27 (2.673)  4.32 (2.214)   9.37 (2.856)   6.51 (3.029)

Table 7. Overall CCTST scores and subscores compared by instructor status: officer, NCO, and civilian (n = 346).

Group      n     Overall         Analysis      Inference     Evaluation    Induction      Deduction
Officer     51   20.53 (5.201)   5.06 (1.318)  9.45 (2.633)  6.02 (2.396)  11.43 (2.594)  9.10 (3.195)
NCO        248   15.18 (4.613)   4.13 (1.454)  7.01 (2.378)  4.04 (1.998)   9.06 (2.630)  6.12 (2.717)
Civilian    47   14.45 (5.785)   4.28 (1.542)  6.26 (2.967)  3.91 (2.283)   8.74 (3.287)  5.70 (3.035)
Total      346   15.87 (5.239)   4.29 (1.479)  7.27 (2.668)  4.32 (2.211)   9.37 (2.849)  6.50 (3.031)

Table 8. Overall CCTST scores and subscores of instructors of leadership and of medical skills (n = 346).

Group            n     Overall         Analysis      Inference     Evaluation    Induction      Deduction
Leadership        13   17.38 (5.331)   4.38 (2.022)  8.38 (2.599)  4.62 (2.022)  10.38 (1.938)  7.00 (3.786)
Medical Skills   323   15.84 (5.219)   4.27 (1.457)  7.25 (2.659)  4.32 (2.216)   9.34 (2.849)  6.50 (3.016)
Missing           10

Table 9. Overall CCTST scores and subscores of instructors compared by MOS (n = 346).

Group                                    n     Overall         Analysis      Inference     Evaluation    Induction     Deduction
Radiology Specialist                      12   16.92 (3.554)   4.75 (0.965)  8.25 (2.137)  3.92 (1.975)  9.83 (2.209)  7.08 (2.539)
Veterinary Food Inspection Specialist     10   16.90 (4.725)   4.70 (1.494)  7.80 (1.989)  4.40 (2.459)  9.90 (2.514)  7.00 (2.828)
Medical Laboratory Specialist             52   16.73 (4.703)   4.25 (1.440)  7.60 (2.345)  4.88 (1.957)  9.87 (2.552)  6.87 (2.822)
Preventive Medicine Specialist            13   16.31 (4.270)   4.46 (1.391)  7.62 (1.981)  4.23 (2.488)  9.77 (1.787)  6.54 (2.847)
Mental Health Specialist                  19   15.47 (4.683)   4.47 (1.349)  6.89 (2.536)  4.11 (2.052)  9.42 (3.288)  6.05 (2.345)
Preventive Dentistry Specialist           13   14.54 (4.075)   3.62 (1.557)  7.15 (2.703)  3.77 (1.641)  9.38 (2.181)  5.15 (2.444)
Health Care Specialist                    58   13.88 (4.967)   3.81 (1.561)  6.50 (2.515)  3.57 (2.010)  8.22 (2.740)  5.66 (2.911)
Hospital Food Service Specialist          10   13.30 (3.713)   4.00 (0.667)  5.90 (2.234)  3.40 (1.776)  8.00 (2.211)  5.30 (2.359)
Practical Nurse                           16   13.06 (3.435)   3.38 (1.628)  6.25 (1.770)  3.44 (1.590)  7.69 (1.922)  5.38 (2.500)
Totals                                   203   15.19 (4.671)   4.09 (1.466)  7.05 (2.390)  4.05 (2.032)  9.05 (2.633)  6.14 (2.756)

Table 10. Overall CCTST scores and subscores of instructors compared by teaching department (n = 346).

Group                              n     Overall         Analysis      Inference     Evaluation    Induction      Deduction
Combat Medic Training               75   13.31 (4.832)   3.72 (1.547)  6.03 (2.371)  3.56 (2.015)   7.96 (2.778)  5.35 (2.658)
Clinical Support Services           83   16.94 (4.516)   4.45 (1.373)  7.78 (2.353)  4.71 (2.027)  10.02 (2.434)  6.92 (2.786)
Dental Sciences                     26   15.50 (4.752)   4.27 (1.687)  6.85 (2.679)  4.38 (1.899)   9.88 (2.215)  5.62 (3.021)
Health Services Administration      21   13.29 (5.159)   4.00 (1.378)  6.19 (2.822)  3.10 (2.143)   7.76 (3.254)  5.52 (2.502)
Medical Science                     52   15.90 (4.860)   4.31 (1.229)  7.35 (2.504)  4.25 (2.239)   9.06 (2.697)  6.85 (2.933)
Preventive Health Services          39   17.46 (5.481)   4.62 (1.350)  8.00 (2.734)  4.85 (2.477)  10.33 (2.950)  7.13 (3.189)
Veterinary Services                 25   19.68 (5.505)   5.04 (1.274)  9.12 (2.818)  5.52 (2.400)  10.96 (2.638)  8.72 (3.348)
Leader Training Center              13   17.38 (5.331)   4.38 (2.022)  8.38 (2.599)  4.62 (2.022)  10.38 (1.938)  7.00 (3.789)
Other                               10   15.00 (5.944)   4.80 (1.476)  6.30 (2.830)  3.90 (2.424)   9.10 (3.755)  5.90 (2.601)
Missing                              2
Totals                             346   15.88 (5.252)   4.30 (1.473)  7.27 (2.676)  4.31 (2.217)   9.37 (2.855)  6.51 (3.029)

Table 11. Overall CCTST scores and subscores of officer instructors compared by assignment background (n = 51).

Group     n    Overall         Analysis      Inference      Evaluation    Induction      Deduction
TDA       14   22.36 (3.319)   5.36 (7.45)   10.21 (2.259)  6.79 (2.045)  12.21 (1.672)  10.14 (2.214)
Both      34   19.65 (5.645)   4.94 (1.455)   9.15 (2.743)  5.56 (2.452)  11.03 (2.876)   8.62 (3.438)
Missing    3
Total     48   20.44 (5.194)   5.06 (1.295)   9.46 (2.633)  5.92 (2.386)  11.38 (2.623)   9.06 (3.185)

Table 12. Overall CCTST scores and subscores of NCO instructors compared by assignment background (n = 248).

Group     n     Overall         Analysis      Inference     Evaluation    Induction     Deduction
TDA        26   16.85 (4.211)   4.62 (1.444)  7.38 (2.368)  4.85 (1.826)  9.88 (2.142)  6.96 (2.891)
TOE        14   15.36 (5.183)   3.71 (1.069)  7.29 (2.758)  4.36 (2.023)  9.14 (2.983)  6.21 (2.833)
Both      204   15.01 (4.577)   4.09 (1.468)  6.99 (2.339)  3.93 (1.991)  8.99 (2.628)  6.02 (2.693)
Missing     4
Totals    244   15.23 (4.591)   4.13 (1.453)  7.05 (2.361)  4.05 (1.990)  9.09 (2.606)  6.13 (2.726)

Table 13. Overall CCTST scores and subscores of officer instructors compared by deployment experience (n = 51).

Group          n    Overall         Analysis      Inference      Evaluation    Induction      Deduction
Humanitarian   19   22.00 (3.496)   5.32 (0.749)  10.21 (1.932)  6.47 (1.867)  12.16 (1.708)  9.84 (2.387)
Both           11   19.09 (5.558)   4.27 (1.618)   8.82 (2.960)  6.00 (2.191)  11.00 (2.324)  8.09 (3.700)
Neither        14   21.79 (4.492)   5.64 (0.929)   9.64 (2.818)  6.50 (2.139)  11.86 (2.413)  9.93 (2.947)
Missing        14
Totals         51   21.20 (4.465)   5.16 (1.180)   9.68 (2.513)  6.36 (2.001)  11.77 (2.112)  9.43 (2.968)

Table 14. Overall CCTST scores and subscores of NCO instructors compared by deployment experience (n = 248).

Group          n     Overall         Analysis      Inference     Evaluation    Induction     Deduction
Combat          31   13.77 (4.417)   3.68 (1.681)  6.55 (2.188)  3.55 (1.670)  8.35 (2.374)  5.42 (2.826)
Humanitarian    84   15.48 (4.419)   4.14 (1.381)  7.21 (2.298)  4.12 (2.008)  9.30 (2.592)  6.18 (2.580)
Both            54   15.19 (4.589)   4.02 (1.236)  7.20 (2.358)  3.96 (2.282)  9.04 (2.642)  6.15 (2.818)
Neither         78   15.53 (4.821)   4.37 (1.555)  6.92 (2.475)  4.23 (1.893)  9.19 (2.683)  6.33 (2.762)
Missing          1
Totals         248   15.21 (4.593)   4.13 (1.456)  7.04 (2.352)  4.05 (1.997)  9.09 (2.607)  6.13 (2.719)