Since I began my independent academic career, I have been highly interested in advancing student learning in the classroom. I was raised in and survived what I would call the traditional education model. Make no mistake: I had excellent professors who taught me the fundamentals of chemistry and other topics, but along the way I had to teach myself how to learn effectively based on the way the material was taught. Now that I have been at UT Arlington for 10 years, I have had many opportunities to refine my own approaches to teaching, in hopes that the learning curve to achieve meaningful learning is not as steep for my students as the one I had to manage. I have written about some of these activities before (1–3), but here I want to convey a slightly different story, one where I have tried repeatedly to disseminate some of the educational research we have performed.
I hardly ever touched a textbook when I was an undergraduate. This might have been part of the reason that I began my first two semesters at the College of William & Mary with 1.65 and 1.80 grade point averages*. A more apt explanation, though, is probably that I did not faithfully attend class; I was rather consumed with playing football, pledging a fraternity, and socializing as higher priorities. Clearly my high school study skills were not going to get me through my courses in college. I also realized that I had little chance of becoming the next Indiana Jones when I received a “D” in Intro to Archaeology. I am not ashamed to admit these things, because I have come a long way since those days. My first revelation was that if I simply went to class, my knowledge of the subject matter improved. I became a feverish note taker and found that if I wrote down everything on the board, as well as everything the professor said, I could go back through these notes several times and refine them into study guides for exams. I still rarely used a textbook, but I became quite good at learning the information from repetitive penning of the material. Some of my classmates hardly took notes, and they still did quite well in class. I realized then that students learn in many different ways, a realization that I would carry forward into my own teaching career. Incidentally, I did close out my college career with a couple of 4.0 semesters, happy that I had found my way through some pretty grueling courses, such as Advanced Physical Chemistry and Advanced Inorganic Chemistry. I gave up football after my first year at college, but I still maintained my social life and was able to couple that with academic success.
The notion of providing alternative models for teaching students in high-loss courses (specifically, General Chemistry, Precalculus, and Calculus I) won my colleagues and me a $2 million STEM Talent Expansion Program grant from the National Science Foundation in 2008 (www.uta.edu/aurus). By high-loss, I mean that approximately 50% of students in these courses received a “D,” an “F,” or withdrew from the course, and thus could not go on in their sequence of planned courses. Because UT Arlington has an extremely diverse student population (the fifth most diverse student population of public universities in the country, according to U.S. News and World Report), including over 50% of undergraduates who are the first in their family to attend college, we needed additional programming that could build cohesion, community, and confidence in learning the material. We implemented a Treisman-style Emerging Scholar Program, which provided two to four hours (depending on the course) of additional focused instruction for the students, with activities grounded in the active learning pantheon and meant to build a sense of community for the students as they grappled with challenging course material. We call this a content-intensive collaborative learning (CICL) framework. We tracked the progress of students and assessed the program meticulously, in hopes that we would be able to disseminate what we had learned to the rest of the academic community. In general, we were successful in raising the passing rate of students in the course by as much as 15–20%, consistently, over eight semesters. During the course of this project, I was able to mentor a PhD student interested in chemical education research. Not only did he run our CICL program in chemistry, he designed (and tested on more than 300 general chemistry students) a web-based platform for teaching stoichiometry. He tested the effectiveness of different teaching strategies we found in the literature, and we learned some meaningful information.
As we all know, the experiment is never complete until it has been disseminated. Thus, we set about preparing manuscripts to report on both the effectiveness of our CICL program and the results of the chemical education research. For the CICL program, we focused on the aspects of our instructional model that varied, and their effects on student success, over two sequential semesters in our Chemistry for Engineers course. Not surprisingly, but just as a tidbit for you, one of the most striking findings was that students were more successful when they actually attended our CICL course. But there were other important results, such as the fact that with careful programming, we could be just as successful helping students with a two hour per week CICL course as we were with a four hour per week course. We wrote up what I believed to be an excellent account of our efforts (with lots of statistical assessment beyond that which I am myself capable of doing) and submitted it to the Journal of the American Society for Engineering Education. It was promptly rejected. We retooled the manuscript based on the reviewers' comments and then submitted it to the Journal of Chemical Education, where it was also rejected. We are now refining it further in hopes that we can find an appropriate home for this manuscript in the peer-reviewed literature.
Around the time that all of this was happening, we also prepared and submitted our first manuscript on the stoichiometry education research to the Journal of Chemical Education. It too was promptly rejected. The reviewers indicated that this research should be split into two separate papers. Fine. If we could get two papers out of the material, then I was all for it. We wised up a bit and enlisted the help of one of our Science Education faculty colleagues to make sure we were preparing the paper in the manner expected by the educational research community. Lesson learned. Apparently, an article formatted in a manner befitting a journal such as Analytical Chemistry (impact factor = 5.8) is not good enough for reporting in the Journal of Chemical Education (impact factor = 1.0). We were willing to try to play by their rules. Our faculty colleague in Science Education helped us split the paper and refine the two new manuscripts, so that our research questions were specifically stated and our findings were formatted acceptably for the educational research world of journal publications. Our educational research colleague has consistently assured us of the quality of the work, and she has published many articles in the science education research literature.
We submitted the first paper to the Journal of Chemical Education again, explaining that we had carefully heeded the suggestions from our last unsuccessful submission in preparing this new one. After a lengthy review process, we received three reviews. The first two reviewers said to publish the manuscript essentially as-is (nice!), whereas the third reviewer wanted major revisions. This immediately made me think of the YouTube video where Adolf Hitler (in subtitles) is lamenting the comments of the third reviewer. If you have never seen it, search for it and take a look; it is pretty darn funny. Anyway, because of the word-count restrictions on the main manuscript body, we had included a pretty lengthy supplemental information document along with the paper. Virtually all of the experimental methods were recounted in the supplementary document. I wrote to the handling editor and indicated that most of the changes suggested would probably need to be made to the supplemental document and not the main manuscript, since most of the requested changes were to clarify the methods used in the study. I asked her if this would be okay. She never replied to my query. We proceeded to address all of the changes, make the revisions to the manuscript and supplemental document, and then resubmit it. In one week, we received a notice from the handling editor that the paper was rejected, largely because we had made most of the requested changes to the supplementary document. Go figure. I wrote back to the editor and asked her for reconsideration. I pointed out that we had asked how best to handle the revisions, but that she did not respond. We were willing to re-revise the article according to her suggestions, if what we had done already was unsatisfactory. She never responded to my email (even when it was also copied to the editor in chief of the journal). Very frustrating.
Resigned that we would not publish this article in the Journal of Chemical Education, we next submitted it to the International Journal of Science Education. At this point, we were outside the realm of educational research journals I knew about, but our Science Education collaborator suggested this one would be appropriate. After two months in review, the article was rejected!
I am not one to be completely jaded about anything, and I generally do not let failures overly bother me; they are a necessary part of this job. Yet here, to me, is a clear disconnect. Science education researchers often lament that professors teaching science do not use the educational research available to them to improve student learning. I concur with this sentiment. It is a simple thing to open up an educational research journal, find new ideas about how to teach your course, and incorporate best practices, but I do not think that a lot of professors do this. In this case, we were science researchers delving into and trying to contribute to science education by disseminating our findings. I cannot help but think that my academic title as the corresponding author (an analytical chemist doing science education research?!), even though we also had a Science Education coauthor, caused the editors and reviewers of our paper to do a double take and think twice about the validity of our work. That is fine, but it begins to feel like we were not given a fair chance. How are we to encourage pure science researchers to engage in scholarly teaching and disseminate their findings appropriately? As someone who has published many hard science articles, I have begun to feel that pursuing educational research is simply not worth the time and effort, relative to my primary analytical chemistry efforts. I am not ready to give up quite yet, but I think something needs to be done. Perhaps I should start a new journal called Science Education Research Performed by Science Researchers (SERPSR; has a nice ring to it, right?). Thanks for reading while I am up on my soap box.
(1) K.A. Schug, “Five Steps in the Evolution of an Instrumental Analysis Course for Enhanced Student Preparation,” The LCGC Blog, February 11, 2015. http://www.chromatographyonline.com/five-steps-evolution-instrumental-analysis-course-enhanced-student-preparation
(2) K.A. Schug, “My Own March Madness - Four Dissertations, Analytical Chemistry, and Chemical Education,” The LCGC Blog, April 10, 2014. http://www.chromatographyonline.com/lcgc/Blog/The-LCGC-Blog-My-Own-March-Madness---Four-Disserta/ArticleStandard/Article/detail/840100?contextCategoryId=50130?topic=128,129,114
(3) K.A. Schug, “What Is the Optimal Training to Provide to Students Interested in a Career in Industry?” The LCGC Blog, February 13, 2014. http://www.chromatographyonline.com/lcgc/Blog/The-LCGC-Blog-What-Is-the-Optimal-Training-to-Prov/ArticleStandard/Article/detail/835242?contextCategoryId=50130
* For readers unfamiliar with the U.S. educational and grading system, the highest grade is an “A,” worth 4 points, and the lowest passing grade is a “D,” worth 1. An “F” is a failing grade. Thus a grade-point average of 1.65 or 1.80 means the student received grades of “C” and “D.”