Discussions from the magazine, website, blog and social media
Last issue, David Read reported on research into the impact of flipping organic chemistry teaching. The researchers found a small, but statistically significant, improvement in exam marks on adoption of the flipped model.
On reading that flipped teaching resulted in improved exam marks, Rissa Sorensen-Unruh (@RissaChem), a chemistry professor in Albuquerque, US, remarked:
Isn’t this a given by now?
@TheOtherDrX, a lecturer in the UK, queried how effective the intervention had been:
Is a Cohen’s d of 0.11 (mean 65 vs 63) really an effect? Do effect sizes get much smaller? I get year-on-year effects greater than this with no intervention. Get statistical diff variations year on year. Need stronger evidence.
The author, David Read (@lowlevelpanic), replied:
I don’t disagree, but I think there’s more to the story than just the stats. Much more evidence needed, for sure!
Yes. Re-reading, it at least supports that flipping is as effective. My dropout rates vary vastly year to year tho.
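For readers unfamiliar with the statistic under discussion, Cohen’s d is the difference between two group means divided by their pooled standard deviation. A minimal sketch in Python using only the standard library (the score lists below are illustrative, not the study’s data):

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference of means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    # statistics.variance gives the sample variance (n - 1 denominator)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Illustrative arithmetic: a 2-mark gap (65 vs 63) yielding d = 0.11 implies a
# pooled standard deviation of roughly 2 / 0.11, about 18 marks.
```

This makes the objection above concrete: the reported effect reflects a mean difference that is small relative to the spread of marks within each cohort.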
Tom Worthington, a teacher in the Canberra area, Australia, read further into the story:
What I think is more significant is the 10% reduction in risk of failure, when the course was flipped. A better headline would be ‘flipped classroom helps at-risk students’.
From Taiwan, John Jenkins commented on how flipped teaching has worked for him:
I flipped three university courses last semester and had the best response from my students in nine years. Although I have no statistical analysis to support the improvement of my students, I observed a greater involvement of the students flipping my courses. I also realised that fewer students struggled with the content.
Ho-yin Cheung, an educational researcher in Hong Kong, said:
I do flipped classroom too and I am excited by the positive results reported in the literature as well as in my own data. In terms of academic research, however, I often find it difficult to differentiate between the impacts brought by flipped classroom and impacts from other factors, such as a more thoughtful course preparation, (possibly) better teacher, better use of technology, and longer study time by the students. I have a feeling that we need to look closer into the effects of different components of flipped classroom. Any ideas?
Also in last issue, David Read described the development and implementation of a novel pedagogy: dynamic problem-based learning (dPBL). In dPBL, groups of students receive different datasets or are provided with different routes through a problem, meaning that each group tackles an individualised problem.
Nicholas Schlotter, associate professor of chemistry at Hamline University, US, wrote:
My concern is whether the students are ready for the next step(s) in their education. If they go to a traditional organic course, how do they do? The dPBL in the article suggests they learn what they need to solve the problem, but whenever one solves a problem you rarely need all the ‘tools of the trade’ for a given problem. Where do they learn the rest of the foundational material in general chemistry in the dPBL program?
Timothy Herzog, associate professor at Weber State University, US, explained why this kind of teaching is important for students:
The most important ‘tool of the trade’ students learn in lower division chemistry classes is the ability to solve new problems. In my opinion most algorithmic learning, in which students follow a series of steps, is easily lost and also easily reproduced again. There are many teaching approaches that require students to develop real problem solving skills, including problem based learning, POGIL, peer led team learning, and others. All of these approaches share a common theme in that students develop their own understanding and develop their own approaches to solving problems. In my experience, it is the students who have these skills that are most successful in higher level classes and not the ones who have learned the most equations.
So Tim, you feel that if the dPBL problem doesn’t cover equilibrium a student would be at no disadvantage going into analytical chemistry? In my experience good students in either environment do well in higher level classes. The real question is what impact does something like dPBL have on the ‘at-risk’ students?
I think that equilibrium would be too big of a topic to ignore in a problem based approach. From what I’ve seen in problem based approaches, they would not skip the core concepts, but would choose problems that would require understanding of important concepts like kinetics, thermodynamics, atomic theory, etc, without explicitly saying ‘this week we will talk about acid base equilibrium’. I do think it would be a disservice to students to go through all of general chemistry without learning about equilibrium concepts, but if they didn’t learn explicitly about Ksp, but understood equilibrium, it would not hurt them down the road since Ksp is merely an extension of other equilibrium problems. I do understand your concern and think that the selection of problems and the guidance of the instructor would be critical in giving the students a breadth of experience while also allowing them to explore based on their curiosity.
Thanks Tim. I guess I’m not convinced that one can trust students to figure out the details of a topic without being guided through the material. I would have to blend the course (mixture of lecture and dPBL) to be sure they cover the basics.
In my opinion, project based learning would actually require more guidance and coaching from the instructor than a traditional class. I use POGIL (process oriented guided inquiry learning) in my classes, and while my lectures are no longer the primary source of information for the students, I am very busy monitoring progress, answering questions, and intervening when students need a push in the right direction. When I teach this way, I know a lot more about what my students actually understand than when I lecture. I think the same would be true of a thoughtfully facilitated project based learning class.
Practical science assessment
In March, Ofqual released the results of its consultation into the assessment of practical science at GCSE. The new approach scraps direct assessment of practical work in favour of assessing knowledge of lab techniques in the final written exam. In a blog post for Education in Chemistry, Mary Whitehouse explained why this is a step forward:
The vast majority of teachers want to do practical work. Currently students may only experience the single experiment required for controlled assessment – the new system has the potential to provide a much richer experience.
David Smith, professor of chemistry at the University of York, UK, was worried that some schools would take shortcuts:
My concern is that in many schools, teachers will simply demonstrate key experiments to students in order to save time, which they can instead spend on teaching to the test. As we see repeatedly, educational activity evolves in response to the nature of the assessment. In the same way, if a school’s position in league tables is no longer dependent on practical science, it will make it all the harder for science teachers to persuade their heads of school to invest in much needed equipment, laboratories and technical support.
Debs Bradders wrote:
Practical work is an integral part of chemistry whether it is assessed or not. I have taught for 30 years and I have used safe sensible practical work for every one of the different types of exam I have delivered: O-level, CSE, GCSE etc. Practical assessment ranged from nothing at all, to tick lists for skills, controlled assessment units, investigations, and ISAs. You don’t need to examine [practical work] to experience it. You don’t need to take part in contrived practical investigations and assessments to understand scientific method. Ask the teachers; just teach science!
Anežka Marie Sokol gave us a view from Denmark:
In Danish high schools all levels of chemistry can be finished by an experimental exam (if the school has the resources) and the exams at all levels of chemistry are focused around the experiments done by the student. The understanding of the curriculum is assessed through the practical and theoretical explanation of the experiment by the student.
Alfredo Tifi, a chemistry teacher in Italy, said the move to assessment of practical skills by written exam will result in teaching that encourages students to develop a deeper understanding of the activities they undertake.
The same students will be prepared to face the intellectual part of grasping, interpreting and deducing new conclusions from experimental data.
Also on the blog, Michael Seery looked at the benefits of using worked examples in problem solving. He described a ‘fading’ approach that allows a student to develop their own structure for tackling questions.
Philip Hamann, visiting professor of chemistry at Vassar College, New York, US, warned that the approach needs careful handling:
Worked examples are important, of course, but they risk shallow learning. If we are not careful, students memorise algorithms to do problems without any real understanding of what is behind the solution, even if we tell them the basis for the solution. I had a bright chemistry student who could get every problem correct for which a worked problem had been given, but couldn’t do problems involving all the same concepts and basic steps if they were altered a little. ‘You didn’t show us how to do this one’ was the comment when this student had trouble. I responded, ‘My measure of success is not seeing you solve problems I taught you to solve, but seeing you solve problems I didn’t teach you to solve, but gave you all the tools you need for a solution.’
Michael Seery explained how the method works in practice:
I agree, and this is why I particularly like the fading approach. In fading, as we give students less and less information, they are challenged to see if they can really solve the problem independently or whether they are relying on rote processing. The next iteration is where we change the context – in the case of your student he or she wasn’t making that next jump. In other words I don’t see worked examples as the only tool, but one in a sequence where we equip students to approach topics. Thus we can allow them to use this prior knowledge in approaching problems we didn’t teach them to solve.
Wade Ellis from Utah, US, said:
It seems to me that worked examples give a novice learner something to memorise that they can later use to actually understand what they are doing. The big obstacle seems to be giving the student confidence to try the problem instead of just claiming they don’t know how to even start it.
In March’s issue of Education in Chemistry, we featured massive open online chemistry courses (MOOCs). The article discussed the Introduction to Physical Chemistry MOOC from the University of Manchester. The authors described Introduction to Physical Chemistry as ‘the only chemistry subject MOOC or distance learning course delivered in the UK to an undergraduate audience and, to the best of our knowledge, the only physical chemistry distance learning course in the world’. While this statement is accurate in reference to MOOCs, the use of the term ‘distance learning’ may have been misleading. Distance learning courses in chemistry have been available for many years from providers such as the Open University.