<!-- Always create dissertations with the following "diss" template! -->
<!-- If you would like to provide further details, please do so in the free text that follows. -->
{{diss
| name = Nathalia Segura-Caballero
| titel = Gender differences in spatial thinking training: an analysis based on the RIF 3.0 platform
| hochschule = Paris Lodron Universität Salzburg
| jahr = 2022
| typ =
| betreut = Karl Josef FUCHS
| begutachtet =
| download =
| sprache =
| note =
| pruefungam =
| schulart =
| stufe =
| matheduc =
}}
== Kontext ==
Navigating with a GPS, parking our car or simply filling our fridges with groceries are basic activities of our daily lives. However, in order to fulfil them effortlessly we need the ability to think spatially. Spatial thinking is an umbrella term that comprises different concepts, such as spatial perception, spatial ability, visual perception and spatial intelligence (Maresch & Sorby, 2021). It refers to:
“the human ability to direct optical stimuli received by the eye into the brain, to be able to interpret these stimuli, to be able to recognize spatial objects, to be able to mentally imagine spatial objects (with or without prior optical stimuli), to be able to manipulate these objects mentally, to be able to imagine taking other perspectives in space, to be able to perceive and interpret motion sequences, and to be able to execute spatial motor movements” (Maresch & Sorby, 2021).
In addition to being necessary for carrying out everyday tasks (Maresch & Sorby, 2021), spatial thinking has been linked to educational performance in the STEM areas (Science, Technology, Engineering and Mathematics) (Buckley et al., 2018), which are essential to promote economic growth, international competitiveness and job creation (Ismail, 2018).
Nevertheless, one question that arises when talking about spatial thinking is whether one is born with it (nature) or it can be developed (nurture). Fortunately, there is considerable evidence that, although there is a genetic component (reference), spatial thinking can be improved through practice (Taylor & Hutton, 2013), and, according to Newcombe and Frick (2010), this instruction should start in early childhood in order to have the greatest outcomes in the future.
With this same purpose of training spatial thinking skills, the online platform RIF 2.0 was launched by a group of experts from the Universities of Salzburg and Graz (Austria) in 2019 (Maresch & Müller-Kreutzer, 2021). It offered more than 700 interactive tasks for participants aged 13 and over, and within less than two years it attracted more than 30,000 users. Given its success, the 3.0 version of the platform is being developed, aimed at pre-school and primary school children aged between 4 and 12 years.
This platform has been developed within the framework of online learning environments, as it is a tool that provides instruction delivered on a digital device (Mayer, 2019). Online learning is growing as a popular training scenario, as it improves access to education by reducing the temporal and spatial constraints of traditional education while, at the same time, reducing its costs (Panigrahi et al., 2018). However, one of the main challenges of online learning is that instruction is often solitary and there is little room for interaction (Jensen et al., 2021). Given this circumstance, feedback appears to play a major role in online learning contexts, as it is one of the few processes through which instructors can assist students in their learning and engage and motivate them (Cavalcanti et al., 2021).
Feedback, understood as information provided by an agent about one’s performance or understanding (Hattie, 2010, p. 147), is a powerful tool through which the learning process can be enhanced (Hattie & Timperley, 2007). More specifically, this type of assessment for learning is also known as formative assessment (Shute & Rahimi, 2017). In contrast to summative assessment (or assessment of learning), which gathers information and delivers it in the form of grades, certificates and the like after the instruction, formative assessment involves supporting the teaching and learning process by incorporating feedback during the instruction. Through formative feedback, students can not only identify and correct errors by developing more efficient strategies (Van der Kleij et al., 2015); it also has a great influence on achievement, as it is a motivating learning factor (Shute, 2008). However, feedback effects may vary greatly, since they are conditioned by different factors, such as the level at which feedback is directed (Hattie & Timperley, 2007), the type of feedback (Shute, 2008) and the timing of feedback (Candel et al., 2021), among others. These factors are described below.
Levels of feedback
Hattie and Timperley (2007) describe four levels at which feedback can be aimed: the task, the process, self-regulation and the self. Feedback at the task level has a corrective function; for example, it informs students whether their answer is correct or incorrect. Feedback at the process level intends to clarify the process that needs to be followed to accomplish the task, such as providing cues about the steps to take. Feedback at the self-regulation level addresses students’ way of directing their own learning process, such as prompting them to check whether they have used all the information available to answer the questions. Feedback at the self level is unrelated to the task performed and focuses instead on the learner’s personal characteristics (e.g. “Good girl/boy!”). Feedback at the process level seems to be the most effective, as it promotes deeper learning, while feedback directed at the self is the least effective, because of its uninformative nature.
Feedback types
The type of feedback refers to the content of the feedback message. Shute (2008) carried out a thorough literature review and classified different types of formative feedback based on their complexity, that is, how much and what information is included in the feedback message (p. 159). The three most commonly used feedback types are the following (Shute, 2008; Attali & Van der Kleij, 2017); a short sketch of how they differ in information content is given after the list:
* knowledge of results (KR): informs about the correctness of the response;
* knowledge of correct response (KCR): not only informs whether the answer is right or wrong, but also provides the correct answer;
* elaborated feedback (EF): can take different forms and provides further information about the correct answer in the form of cues, hints, explanations, examples, etc. It is usually accompanied by KR or KCR.
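To make the difference in information content concrete, the following minimal Python sketch composes a feedback message for a single item. It is only an illustration of the KR/KCR/EF distinction as defined above; the names FeedbackType, Item and compose_feedback, the item structure and the message wording are hypothetical and not taken from the RIF platform.

<syntaxhighlight lang="python">
from dataclasses import dataclass
from enum import Enum


class FeedbackType(Enum):
    """The three feedback types discussed above."""
    KR = "knowledge of results"
    KCR = "knowledge of correct response"
    EF_KCR = "elaborated feedback combined with KCR"


@dataclass
class Item:
    """A single assessment item (hypothetical structure, not the RIF data model)."""
    prompt: str
    correct_answer: str
    hint: str  # only used for elaborated feedback


def compose_feedback(item: Item, given_answer: str, ftype: FeedbackType) -> str:
    """Build a feedback message whose information content depends on the type."""
    correct = given_answer.strip() == item.correct_answer
    # KR: only the correctness of the response
    message = "Correct." if correct else "Incorrect."
    # KCR: additionally reveal the correct answer
    if not correct and ftype in (FeedbackType.KCR, FeedbackType.EF_KCR):
        message += f" The correct answer is {item.correct_answer}."
    # EF: add a cue or hint that invites the learner to reconsider the strategy
    if not correct and ftype is FeedbackType.EF_KCR:
        message += f" Hint: {item.hint}"
    return message


# Example: an incorrect answer under the EF + KCR type
item = Item(prompt="Which cube matches the unfolded net?",
            correct_answer="B",
            hint="Mentally fold the net and track the shaded face.")
print(compose_feedback(item, "A", FeedbackType.EF_KCR))
</syntaxhighlight>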
Feedback timing
The timing of feedback concerns the moment of the learning process at which the feedback is delivered, and despite being a broadly studied topic, its role in the feedback process is still not properly understood (Attali & Van der Kleij, 2017). Generally, the literature distinguishes between immediate and delayed feedback (Shute, 2008); however, no clear definitions exist for these terms, as different timings have been used for both. For instance, one study might speak of immediate feedback when it is provided after every answer, while another might use the term for feedback given once the test has been finished; likewise with delayed feedback, which has ranged from minutes after finishing to days or even weeks later. Nonetheless, there seems to be a consensus on the conditions under which each of them is more beneficial: immediate feedback appears to be more advantageous for low-ability learners and difficult tasks, while delayed feedback is better for high-ability learners and easy tasks (Attali & Van der Kleij, 2017).
Characteristics of feedback in this work
Considering what has already been explained about feedback, the specific characteristics that define feedback in this work are described below.
Regarding the levels of feedback, the focus will be on the task and the process. The former is the one most commonly used by teachers and is, in itself, powerful, while the latter seems to enhance deeper learning (Hattie & Timperley, 2007). As for the feedback type, according to the literature, KR is barely beneficial, as it does not provide information about the task. Therefore, KCR and a combination of EF and KCR will be applied in this study. More specifically, EF will be delivered as cues or hints that assist students when their answers are incorrect (Maier et al., 2016), to make them reconsider their strategies. Finally, concerning the feedback timing, immediate feedback is understood as feedback given after answering an item, while delayed feedback is defined as feedback provided after answering all the items in the assessment (Van der Kleij et al., 2012).
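Under these definitions (immediate = after each item, delayed = after the whole assessment), the delivery logic can be sketched as follows, reusing the hypothetical Item, FeedbackType and compose_feedback from the sketch above; run_assessment and answer_fn are likewise illustrative names, not part of the platform.

<syntaxhighlight lang="python">
from typing import Callable, List


def run_assessment(items: List[Item],
                   answer_fn: Callable[[Item], str],
                   ftype: FeedbackType,
                   immediate: bool) -> List[str]:
    """Collect answers and deliver feedback either per item or after all items."""
    messages = []
    for item in items:
        answer = answer_fn(item)
        message = compose_feedback(item, answer, ftype)
        if immediate:
            print(message)        # immediate condition: shown right after each answer
        messages.append(message)
    if not immediate:
        for message in messages:  # delayed condition: shown only once the test is done
            print(message)
    return messages
</syntaxhighlight>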
Following this, the characteristics will be combined, resulting in five conditions for the project (a short sketch of this design follows the list):
# Immediate KCR
# Delayed KCR
# Immediate EF + KCR
# Delayed EF + KCR
# No feedback
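Combining the two types with the two timings yields the 2 × 2 factorial part of the design, plus a no-feedback control. Expressed as plain configuration data (again only an illustrative sketch under the assumptions above, not the platform's actual setup):

<syntaxhighlight lang="python">
from itertools import product

# 2 x 2 combination of feedback type and timing, plus a no-feedback control
CONDITIONS = [
    {"type": ftype, "timing": timing}
    for ftype, timing in product(("KCR", "EF + KCR"), ("immediate", "delayed"))
] + [{"type": None, "timing": None}]  # condition 5: no feedback

for number, condition in enumerate(CONDITIONS, start=1):
    print(number, condition)
</syntaxhighlight>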
=== Literatur ===
Attali, Y., & Van der Kleij, F. (2017). Effects of feedback elaboration and feedback timing during computer-based practice in mathematics problem solving. Computers & Education, 110, 154–169. https://doi.org/10.1016/j.compedu.2017.03.012

Buckley, J., Seery, N., & Canty, D. (2018). A Heuristic Framework of Spatial Ability: A Review and Synthesis of Spatial Factor Literature to Support its Translation into STEM Education. Educational Psychology Review, 30(3), 947–972. https://doi.org/10.1007/s10648-018-9432-z

Candel, C., Máñez, I., Cerdán, R., & Vidal-Abarca, E. (2021). Delaying elaborated feedback within computer-based learning environments: The role of summative and question-based feedback. Journal of Computer Assisted Learning, 37, 1015–1029. https://doi.org/10.1111/jcal.12540

Cavalcanti, A. P., Barbosa, A., Carvalho, R., Freitas, F., Tsai, Y.-S., Gašević, D., & Mello, R. F. (2021). Automatic feedback in online learning environments: A systematic literature review. Computers and Education: Artificial Intelligence, 2, 100027, 1–17. https://doi.org/10.1016/j.caeai.2021.100027

Frostig, M. (1977). Visuelle Wahrnehmungsförderung: Materialien: Visuelle Wahrnehmungsförderung: Arbeitsheft 1.

Fyfe, E. R., & Rittle-Johnson, B. (2017). Mathematics practice without feedback: A desirable difficulty in a classroom setting. Instructional Science, 45(2), 177–194. https://doi.org/10.1007/s11251-016-9401-1

Hattie, J. (2010). Visible learning: A synthesis of over 800 meta-analyses relating to achievement (Reprinted). Routledge.

Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487

Helwig, M., & Schaadt, S. (2008). Fördermaterial: Visuelle Wahrnehmung – Band 1. Verlag an der Ruhr, Taschenbuch.

Homering, C., & Tram, U. (2014). Fördermaterial: Visuelle Wahrnehmung – Band 2. Verlag an der Ruhr, Taschenbuch.

Ikart, E. M. (2019). Survey Questionnaire Survey Pretesting Method: An Evaluation of Survey Questionnaire via Expert Reviews Technique. Asian Journal of Social Science Studies, 4(2), 1–17. https://doi.org/10.20849/ajsss.v4i2.565

Ismail, Z. (2018). Benefits of STEM Education. 14.

Jensen, L. X., Bearman, M., & Boud, D. (2021). Understanding feedback in online learning – A critical review and metaphor analysis. Computers & Education, 173, 104271, 1–12. https://doi.org/10.1016/j.compedu.2021.104271

Kaplan, B. J., & Weisberg, F. B. (1987). Sex Differences and Practice Effects on Two Visual-Spatial Tasks. Perceptual and Motor Skills, 64(1), 139–142. https://doi.org/10.2466/pms.1987.64.1.139

Kass, S. J., Ahlers, R. H., & Dugger, M. (1998). Eliminating Gender Differences Through Practice in an Applied Visual Spatial Task. Human Performance, 11(4), 337–349. https://doi.org/10.1207/s15327043hup1104_3

Maier, U., Wolf, N., & Randler, C. (2016). Effects of a computer-assisted formative assessment intervention based on multiple-tier diagnostic items and different feedback types. Computers & Education, 95, 85–98. https://doi.org/10.1016/j.compedu.2015.12.002

Maresch, G. (2020). Die Grundroutinen des räumlichen Denkens und Handelns. In Zumbach, J., Maresch, G., Strahl, A., & Fleischer, T. (Hrsg.), Neue Impulse in der Naturwissenschaftsdidaktik. Münster: Waxmann. 121–133.

Maresch, G., & Müller-Kreutzer, J. (2021). NEU in RIF 2.0: Nach Alter, Geschlecht und Aufgabenset. 3.

Maresch, G., & Sorby, S. A. (2021). Perspectives on Spatial Thinking. 23.

Mayer, R. E. (2019). Thirty years of research on online learning. Applied Cognitive Psychology, 33(2), 152–159. https://doi.org/10.1002/acp.3482

Newcombe, N. S., & Frick, A. (2010). Early Education for Spatial Intelligence: Why, What, and How. Mind, Brain, and Education, 4(3), 102–111. https://doi.org/10.1111/j.1751-228X.2010.01089.x

Panigrahi, R., Srivastava, P. R., & Sharma, D. (2018). Online learning: Adoption, continuance, and learning outcome—A review of literature. International Journal of Information Management, 43, 1–14. https://doi.org/10.1016/j.ijinfomgt.2018.05.005

Parameswaran, G. (2003). Age, Gender and Training in Children’s Performance of Piaget’s Horizontality Task. Educational Studies, 29(2–3), 307–319. https://doi.org/10.1080/03055690303272

Parameswaran, G., & De Lisi, R. (1996). Improvements in Horizontality Performance as a Function of Type of Training. Perceptual and Motor Skills, 82(2), 595–603. https://doi.org/10.2466/pms.1996.82.2.595

Shute, V. J. (2008). Focus on Formative Feedback. Review of Educational Research, 78(1), 153–189. https://doi.org/10.3102/0034654307313795

Shute, V. J., & Rahimi, S. (2017). Review of computer-based assessment for learning in elementary and secondary education: Computer-based assessment for learning. Journal of Computer Assisted Learning, 33(1), 1–19. https://doi.org/10.1111/jcal.12172

Smith, R. W. (2000). The effects of animated practice on mental rotation tests. https://doi.org/10.25669/VAFH-J7J7

Taylor, H. A., & Hutton, A. (2013). Think3d!: Training Spatial Thinking Fundamental to STEM Education. Cognition and Instruction, 31(4), 434–455. https://doi.org/10.1080/07370008.2013.828727

Van der Kleij, F. M., Eggen, T. J. H. M., Timmers, C. F., & Veldkamp, B. P. (2012). Effects of feedback in a computer-based assessment for learning. Computers & Education, 58(1), 263–272. https://doi.org/10.1016/j.compedu.2011.07.020

Van der Kleij, F. M., Feskens, R. C. W., & Eggen, T. J. H. M. (2015). Effects of Feedback in a Computer-Based Learning Environment on Students’ Learning Outcomes: A Meta-Analysis. Review of Educational Research, 85(4), 475–511. https://doi.org/10.3102/0034654314564881