Dr Luke Miller, Radboud University, Nijmegen
Dr Miller is an assistant professor working on sensory-motor control & tool use
Dr Luke Miller had some ☕ at the error bar in episode 20 #tool #brain #hand #silver #iriki #birthday
our discussion
last month saw the 25th anniversary of a paper that has always been important to me & my career. it was published in the week that I started my undergraduate degree.
that paper is by Professor Atsushi Iriki. it was published in the journal NeuroReport on October the second, 1996. it has been extremely influential, at least in my corner of brain science, & has been cited by nearly 1500 other papers.
in their paper, Iriki & colleagues argued that when monkeys have been trained to use a tool, the parts of their brain which normally respond to the monkey's body - or to the space near their body - begin to treat the tool as if it was part of the body. like an extended arm, or a glove. this idea, that tools become part of the body, is as old & as fascinating as neuroscience itself, but Iriki's paper finally made the idea testable.
on this episode, I've asked two scientists to talk about how this paper has influenced their own work, & how the arguments about tools & the body have developed over 25 years. ah! here they come now.
welcome to the error bar, would you like to introduce yourself?
[LM:] hi, i'm Luke Miller, i'm an assistant professor at the Donders Institute at Radboud University in Nijmegen. & my research focuses on body representation, tool use, & how they intersect.
it's great to have you here. erm, can i get you a drink?
[LM:] since it's the morning, i will take a very strong coffee ☕, to get over my under-caffeination.
excellent.
welcome to the error bar, would you like to introduce yourself?
[MM:] er, hi! i'm Marie Martel & i'm a postdoc at Royal Holloway University of London. my research also focuses on body representation & how it interacts with movement, when your body is growing, so in children, or when your body is stable in size, so you just have to integrate all of the information you get.
excellent. & can i get you a drink?
[MM:] yeah, i think i'll go for some tea, with milk & sugar ☕.
it's the twenty-fifth birthday of Iriki & colleagues' paper, October second, 1996. Luke, do you remember where you were when you first read Iriki's work?
[LM:] so, this would have been in my masters, in, er, in 2010. i was really interested in er, body representation. & i just happened to come across a paper on body representation & tool use. & i thought this was the coolest thing. & they cited Iriki's paper, & so i read that paper, er, & i was just kind of, i think really amazed, that you see this - quote unquote - integration of tools & the body in the brain, & to me that was like, proof that it actually happens - i don't think that any more - but at the time i was extremely er, excited about that. & so that kind of put me on the path to researching this, & actually trying to find evidence for whether tools do actually extend the body.
can you tell me just a bit more about, what you remember about the paper?
[LM:] so, er, in the paper they were, i guess, trying to find neural evidence that tools do actually extend the body. so they were recording from neurons in this er, region of the brain, the anterior bank of the intraparietal sulcus, where you have neurons that are bimodal, meaning that they have somatosensory receptive fields on a certain part of the body, let's say the hand or the arm, but also visual receptive fields. & these visual receptive fields could either be around the body part itself, so maybe right around the hand, but there was also a subset of these neurons that had erm, visual receptive fields that kind of covered the, the space around where the reaching could be. & so the question they asked was: when we, er, record from these neurons in tool-trained monkeys, do you see an expansion of the visual receptive fields around the tool or around the space? & so this is what they actually went to test & found in a large subset of their neurons.
excellent. i remember feeling the same thing, that i was amazed, at first, when i first read this. i think it was in my, my undergraduate research. i think i remember showing it to people & saying 'Look! it's true'. & Marie, how did you first hear about or read this work?
[MM:] um, i think it was a bit later, so i er, probably during the first year of my PhD, that was 2013. i mean, my lab has been quite influenced by this paper, long before i arrived, so i just joined an ongoing project & then discovered this fascinating tool use effect on the kinematics of the movement. & er, then obviously i had to do the whole literature review, so i came across Iriki's paper & er, realised that it was not just the body but also the space.
& how do you think that paper has influenced your, your research later on?
[MM:] well as i said i think it probably was the reason why my lab started investigating tool use, so it was just probably the whole reason i, i got to do a PhD on this. um, but for me it was, erm, my research was a bit more away from this paper. so this is, this was more about how er, tool use modifies space & i focused more on the body side, so how using a tool will extend your arm representation & how it affects your motor control. so i um, yeah, i guess at the basis is this, it was the same, but then we just diverted quite fast.
& you said you joined a lab. which lab was that in particular?
[MM:] i worked with Alice Roy & Alessandro Farnè in Lyon, in France. so yeah, they published this big paper from Lucilla Cardinali in 2009, which was, i think, one of the first showing that when you use a tool it alters the kinematics of your movement, of your free hand after using the tool. it was quite a huge paper & i came after that, so just following up the project.
i went to the same lab - i think you know, i think you know that - & the reason i went to work with Alessandro is because, because he was my enemy, basically. because i spent my PhD on this same topic, so right from undergraduate i remember the paper & i thought 'wow!, this is amazing', i've got to know more about this, & then i did my PhD on the topic. but then i found myself wanting to go & work with Alessandro, & i was a bit nervous because he was the enemy, because i, i was, sort of, critical of this paper.
[MM:] i think it was the same for Luke as well, right? you joined Alessandro's lab because we all have different views on the process, & somehow we just got all together & er
[LM:] yeah i did my er, my postdoc, my first postdoc, with, with Alessandro. but i think he was probably less of an enemy than he was maybe to Nick. but the "enemy" was obviously in "quotes".
yeah you can't see all the "quotes" being put in the air here, on the, on the audio.
this paper stimulated me to get into this research topic, so although i've written very bad things about this paper over the years, actually, it's the reason i'm still here, at least one of the reasons.
[🎶 whoosh! 🎶]
this next bit has been added after the recording, but we agreed that i could add some criticism of the paper.
if this paper appeared on the error bar in 2021, I would have said that it's a great idea, but...
first, the claim is that after using a tool, there is a different response in the brain to visual stimuli presented near the tool. but the visual stimuli were not controlled - the experimenter waved the stimuli around in front of the monkey, & it's likely that there is some experimenter bias in where the stimuli were waved; & this problem gets worse in their second paper in 2000.
second, there was no actual measurement of the apparent changes in responses - all that we are given as a reader is one or two illustrations about how the responses changed. but the data allowed for a much tighter, clearer, & quantitative analysis of what was going on. why has a proper analysis never been done?
third, no statistical analysis was given - perhaps not surprising given that nothing was measured - but even the proportions of neurons that did or did not respond in the way the authors claimed were not analysed. I've done my own tests on these proportions, & there's no significant change in the proportions, across the population of neurons. that's worrying! (a sketch of this kind of proportions test follows after these points)
fourth, even if all of the above problems were addressed, the paper presents very little data to convince the reader that the neurons really are responding to the visual stimuli, & not, for example, movement, touch, air movement, or the monkey's intentions to move. overall, it's an extremely interesting paper, but poorly controlled, & confounded.
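a minimal sketch, in Python, of the kind of proportions test mentioned in the third point above - assuming a simple 2×2 comparison of neurons with & without expanded visual receptive fields, before versus after tool training; the counts below are placeholders for illustration, not Iriki & colleagues' actual numbers.

```python
# a rough sketch of a test on proportions of responsive neurons;
# the counts below are hypothetical placeholders, not the published data.
from scipy.stats import fisher_exact

# rows: [neurons with expanded visual receptive fields, neurons without]
before_tool_use = [4, 20]   # placeholder counts
after_tool_use  = [12, 12]  # placeholder counts

odds_ratio, p_value = fisher_exact([before_tool_use, after_tool_use])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```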
now back to the discussion.
[🎶 whoosh! 🎶]
i guess my question for both of you is: maybe looking back twenty-five years at these really influential papers, & we've all grown & learnt a lot, & science has moved on, even if, even if we didn't really like this paper or we look back & think it's bad, does that matter? let's go to Luke.
[LM:] so i, i guess in the broader scheme of things, i guess what mattered the most here was how influential it was, it kind of stimulated an entire research field around the er, late nineties, early two thousands, got people to start, i guess, taking the question about whether tools actually are integrated parts of the body seriously. i think one of the first people to put this forward was Head & Holmes in their 1911 paper, but then it was kind of radio silence for eighty years or so. & so in that sense, i think, even though there are some flaws to the paper, i think it was good that it stimulated a research field. but what i would have also liked to see was, let's say, researchers go back & try & do these experiments again, but maybe do them better. but i don't think that has happened. so i think that's rather unfortunate.
do you know if anyone has tried to do it? because i know there is a group in London who should have - or could have - done this?
[LM:] i'm, i'm not aware actually of any other research that has, let's say, recorded from these multisensory neurons & looked at how they change during, during tool use. i know that the same group has done further studies, like changes in white matter & gene expressions in these areas. but i don't know of any other group that has, let's say, tried to, to replicate this er, this initial finding of Iriki, in monkeys.
& do you think it's, it's important enough to replicate, to try & replicate in monkeys? because these kinds of studies will involve killing, killing of monkeys, essentially. is it important enough, do you think, to replicate this work?
[LM:] ah, that's a really tough ethical question. i mean if you put it like - that - it makes me think maybe not. but i, but i would say that, that these monkeys, as far as i know - i'm not a monkey researcher - it isn't like they do this one experiment & then they, they, they kill them. i think that they're typically, they would just be a lab animal for a long period of time. but you could even make the claim - er, not the claim, like, you could even raise the question, of whether even that's worth it or not. i don't know, that's a really tough ethical question.
yeah, that may have been unfair, i sprung, sprung that on you, that came out of nowhere.
[LM:] no, i mean, it's, i think that's a fair question.
it's how i often think about this. erm, i'd love it if someone re-did that experiment - better - but then i, yeah, i'm, i'm not sure personally if i think it should be done again. because i think we've learnt, different research fields have moved on, & i think maybe we don't need to do it again, maybe it will, it just serves as an idea-generating paper.
[LM:] yeah, i mean you, you could also say that, um, well there's a lot of groups that look at er, reference frames & visuo-motor transformations happening in the parietal cortex. & these always focus on, let's say, using the body. but er, as humans & some species of monkeys, using tools is a, is a very important part of our daily life, & it might be important to understand what visuo-motor transformations the parietal cortex uses during tool use, & how much is tool-specific, how much is, let's say, re-used from the body. & in this sense it's, i guess, going just beyond the extension part, & actually kind of looking at the underlying reference frame transformations. i think that would be important to know, but i also think you could probably do it in humans.
& Marie, how do you feel about looking back at Iriki's original work?
[MM:] i think i have the same opinion as you two, er regarding some of the flaws, some of the technical issues in this paper, but i think, it was just, i think it's enough because it, it opened a way, & inspired so much research, & now we've moved a bit further away from the space question, we're more into body, & sensory, & action. so i don't think it would matter much or help much, to just go back there, because then there's the risk that it opens - again - all these debates about space versus body, & i think we've lost many years focusing on that.
sorry. sorry about that. can you describe this problem a bit more? so, Iriki's paper was titled about the body schema, & the body schema in the old, in the traditional sense is, is more about your body & the proprioception, & the movement, & the touch that you feel, whereas the data from Iriki's paper were about the visual responses & the visual space around the body. so, how do you see those distinctions, you say it's a bit of a waste of time to go back & er. but how do you see it now? what's the problem now that you are trying to solve?
[MM:] personally, i just decided to ignore this question & just focus on er, action & just er, consider that tools extend the body but trying to understand how it works, which sensory inputs are necessary. i think now there's like two different research fields, one that still focuses on how tool use shapes the space, & these are all the studies with, like, audio-visual paradigms, because the peripersonal space is more like a multisensory area around the body, close to the body. & i think the side on which er, both Luke & I are, which is more, when you use a tool, how does it affect your ability to move your body or to feel & to sense.
& can you describe one of your recent pieces of work that addresses this problem?
[MM:] um, yes, so i think the most recent one investigated, erm, how er, this plasticity of body representation emerges & develops in children. so we, we asked um, children & adolescents to um, reach for an object with the hand, & then with a tool, & then again with the hand. & the idea is that, if you compare the kinematics of the movement - so how fast they move & er, when they start to accelerate, decelerate, & so on, open their fingers, when they reach for an object - if you compare er, the patterns before & after tool use, you observe, you observe differences. & typically in adults the differences are in the direction of um, your movement being done as if you had a longer arm. which is why, we said: 'oh, that's the proof' that using a tool actually extends your body schema, because afterwards you move as if you had a longer arm.
[MM:] & we did that with, er, children & adolescents because, in this population, the body representations are not stable: kids are growing, erm, adolescents the same, it's even worse because they have this growth spurt where they grow very fast & er, they gain a lot of centimetres. & so we wanted to see whether that could affect this ability of, um, plasticity of the body representation. & what we observed is that er, the adult pattern occurred after puberty was done, so when the body was stable in size, then the late adolescents & young adults would display the same pattern as adults, so they would move as if they had a longer arm after tool use. & conversely, er, the children had the, the reverse pattern, so they were moving as if they had a shorter arm after tool use.
[MM:] so we haven't quite figured out why yet, er we're just, er, we're just starting to investigate. what we're thinking is that, er, children have to build up a representation & learn how to use a tool, so they rely mostly on vision. & this would be the reason why we get this er, pattern that changes, as compared to adults that rely on proprioception more. & er, yes, in between, in the pubertal adolescents, at their growth spurt, er, we, erm, observe no kinematic difference at all, so it's as if when the body is growing very much, then there is no plasticity at all of the body representation. so next we are trying to link that to, um, whether body representations that are not updated could lead to clumsiness. so we started er, recording children & adults with dyspraxia to see whether they could have issues with their body representations.
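a minimal sketch, in Python, of the kind of pre/post kinematic comparison Marie describes above - assuming one kinematic parameter (e.g. peak velocity of the free-hand reach) has already been extracted per participant; the values & names are illustrative assumptions only, not data from the study.

```python
# a rough sketch of a paired pre/post tool-use comparison on one kinematic
# parameter; the values below are made-up placeholders, not real data.
import numpy as np
from scipy.stats import ttest_rel

# hypothetical peak velocities (cm/s) of free-hand reaches, per participant
before_tool_use = np.array([72.1, 68.4, 75.0, 70.2, 66.8])
after_tool_use  = np.array([69.5, 65.9, 72.3, 68.0, 64.1])

t_stat, p_value = ttest_rel(before_tool_use, after_tool_use)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```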
that sounds fascinating. & do you find children are paying more attention, like visually, are they focussing more on the tool as they are trying to learn it & adults are able to sort of ignore it, almost, & just treat it as if it was part of their body, because we don't look at our fingers very much when we're, when we manipulate things we look at the object. do you find children are doing different things?
[MM:] yeah that's a, that's a good question. not that i recall. i um, even if i say that children were, had to learn how to use a tool, visually when you look at them, when you watch them using tools, they are very good at it & they behave like adults, it's just on the kinematic level that you see differences. so, typically, what we do is that we have them train with the tool for like fifteen minutes, & er, if you compare the kinematics of the tool use in adults, at the very, the first few trials & the last [few], you don't see any differences. the, the conclusion that we had from this is that there is no motor learning during tool use in adults, so they are very good at the beginning & they stay the same the whole time. but children actually got better with practice - they, they were faster & faster at using the tool, & this is something that normally you don't need er, for body representation plasticity. it might be that they pay more attention to the object, just they have to, because they, the proprioception is also developing at this age, they might have to rely more on vision.
& Luke, how, how has your own research built on Iriki's work & gone forward? what, what particular things have you looked at?
[LM:] so, i think that my work & my PhD was maybe much more influenced by the Iriki study than my current work now. in my PhD, i was interested in how tool use might change the underlying body representation. instead of looking at action, like Marie did, I looked more at tactile perception, particularly like tactile distance perception, so if you use, let's say, a long rake for ten minutes, does that change your perception of the distance between two points of touch on the arm. er, & what we, we found was that it, it seemed to change it in the direction that you might expect if the arm representation, er, had increased in size, let's say.
[LM:] but where, where the Iriki [paper] comes in, & this is where i think that, even if the study itself was not perfect, it was still very influential, because they found a multisensory effect, so my question was: what are the multisensory effects underlying this change in er, tactile perception. so, i, i focused particularly on vision because i actually was in a vision lab. so i was interested in what role visual feedback might be playing in this effect. so one of, one of the studies, er, that i, that i think is one of the coolest that i did was, i did this er, mirror visual feedback illusion that you [Nick] did a lot in your PhD, actually. so i had participants look into a mirror that was aligned to, let's say, their, their left shoulder, & they put their left arm er, behind the mirror. & so they were going to use the tool with their right arm, but they did so while looking in the mirror. so what they got was visual feedback of their left arm using the tool, even though the left arm was either behind the mirror, or actually we even did a condition where it was hanging down by their side. & what we found is that we got the same effects on the left stationary arm as on the right tool-using arm, & that this was, was dependent upon the visual feedback. so if we removed the visual feedback, we no longer saw the er, transfer.
[LM:] so in that sense, the er, the fact that the original [Iriki] effects were in multisensory neurons i think was very influential, in at least guiding me to look at the more multisensory effects. but one thing that they [Iriki] also found which, which goes against what i just said, was that they didn't find changes in the somatosensory receptive fields of their neurons - or at least they claim in their paper that they don't - the only changes were in the visual, er, receptive fields. but we were, er, finding changes, & i've also done EEG [electroencephalography] that shows changes in the somatosensory responses.
when we, when we learn to use a tool, our whole body has to adapt, right? so, you know, if you, if you buy a new pair of shoes, you've got to adapt, every muscle in your body has got to adapt to those shoes. even things like putting on a pair of glasses or a hat, every muscle in the body will show some, some influence of that. & a colleague at Reading when i was there, he told me about this er, physiological effect, basically you can record from any muscle in the body, & then just touch any other part of the body, & you'll, you'll find some response. that's, that was in a study of the cerebellum, i think.
[LM:] very interesting.
yeah, that's always erm, made me, well, very interested in the, the idea of how you adapt to new situations, to new dynamics, to new objects.
[LM:] there is this one study that looked at, i believe there were two different conditions for tool use, one was very normal, like, like a cane, they would reach out & like, poke an object or something & find a change in the arm. but then they did a task where they used the cane to kind of, close their eyes & navigate around the room. & now they were not only finding the change in tactile perception on the arm, but they found it in the legs as well. so it, it does seem to be in some cases a full-body effect, but that raises the question of why you're getting a change in the first place.
it sounds similar to the, some work on prisms & mirror adaptation studies, where if you, if you divert people's eyes to the left or right with special glasses, some people will, will adapt where they feel their eyes to be, other people will adapt where they feel their neck to be, others it's the shoulder that adapts. & so without measuring every single segment of the body, you can never really be sure what's adapting to the particular experimental situation. so, it must be tough! is it difficult to do this research?
[LM:] i, i think it is difficult to do it comprehensively. so, i should say, one of the things that, that i did in my um, er research was, we looked at changes in the arm & er, the hand, & found that which body part seemed to change, in terms of the tactile perception, depended on the type of tool you were using. whether it was a, let's say, a long grabber that was kind of more arm-like, you found it on the arm. & then we had people use this big hand-shaped exoskeleton, & there it only changed the hand but it didn't change the arm. but we weren't very comprehensive there besides just two body parts: we didn't test the upper arm versus the forearm, or the other hand & the other arm, or the legs, or, or what not. so i think, to get a really full picture, if you want to deal with it in a single session it would be quite tough. but maybe that's what people should do.
yeah, so can we, can we answer the question: 'do tools affect the body schema?' is it a simple answer, it's just 'yes, they do', or is it that, to give a comprehensive answer is actually impossible? do you feel, do you feel it's possible to answer this question that Head & Holmes in 1911 or Iriki in 1996, sort of posed, is that even possible? Marie?
[MM:] i don't even think everyone agrees on what is 'body schema' anyway. i, yeah, i don't know, i think, er i think my view would be just to try to integrate the processes, so when you use a tool, what happens in terms of your motor control & how does that fit within the actual model of motor control? & i tend to get away a bit from this body schema question. i don't think i write 'body schema' much in my papers now, i'm trying to, you know, 'body estimate' or 'state estimation of the body'. because, in the end i think we're all speaking about the same thing, & it's just, we're using different terms, & i think that's been an issue in this field for a while, because everything labelled 'body schema' um, was for neuropsychologists, & these are papers that motor control people do not read. & in the end we're all discussing & researching the same thing, but we don't discuss & we don't communicate with each other. so i think - i mean i haven't replied to your question really - but i, i think it's more about maybe changing the terms, & just focusing on how that works & what input you need & what's the point of it, rather than does it work or not.
& Luke?
[LM:] uh, & one of the things that um, most research has done to try & address whether tools are integrated into the body schema, including pretty much all of my own research in my PhD, is these pre-post designs. so, you, you measure something, like you measure tactile perception on the arm, or you measure arm kinematics, or arm bisection. you then have them use tools for a while, & then you do it again & you see a change. but the magic, essentially, is happening during the tool use part, the part that you're not really kind of actually measuring & probing.
[LM:] so one area of research i've been doing lately er, with erm, with Alessandro Farnè during my, my er, my previous postdoc, was um, looking at sensing with tools. & so this actually allows you to measure a er, a behaviour er, during tool use that you would imagine you just do with your body, which is you localise objects on your body, & the question was, can you also localise objects hitting a tool? & how accurate could you actually be with that? i guess the classic example of that would be a blind person navigating with the cane. but even in everyday life we do actually project some tactile sensation onto er, onto tools we use. for example, if you use a, a pencil to, to write on a piece of paper, you can feel the texture of the paper on your pencil, but you don't really feel it at the hands, even though that's where this information enters your nervous system, you actually kind of feel it as if it's on the tip of the pencil itself. & so we actually looked at whether you could actually localise where the hits were happening on the tool, & it turned out you were really good at it - almost perfect at it you might say - which was quite surprising to us. but then we also looked at whether you might be re-using some of the brain, er, processes that happen to localise touch on the body to do so on the tool.
[LM:] & we've done er, some EEG experiments that have actually kind of lined these up. so you can er, you see very similar location-specific responses in the brain, for er, touches happening on the arm, as you do for touches happening on a tool. & you can actually use multivariate decoding algorithms to like, train the decoder on the arm data, & then decode localisation on the tool. i think that that's some evidence perhaps that at least, even if you don't want to go the route of saying that it has anything to do with the body schema, there's at least some kind of re-use happening, for, for maybe computations your brain has that are body-specific, to now use those for a behaviour that is specific to the tool.
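a minimal sketch, in Python, of the cross-decoding logic Luke describes - train a classifier on touch location from the arm trials, then test it on the tool trials; the array names, shapes, & labels here are illustrative assumptions, not the published analysis pipeline.

```python
# a rough sketch of arm-to-tool cross-decoding; data shapes & labels are
# hypothetical placeholders, not the actual EEG dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# trials x features (e.g. channels x timepoints, flattened);
# y codes the touched location (0 = near location, 1 = far location)
X_arm, y_arm   = np.random.randn(200, 64), np.random.randint(0, 2, 200)
X_tool, y_tool = np.random.randn(200, 64), np.random.randint(0, 2, 200)

clf = LinearDiscriminantAnalysis()
clf.fit(X_arm, y_arm)                        # train on touches to the arm
cross_accuracy = clf.score(X_tool, y_tool)   # test on touches to the tool
print(f"arm-to-tool decoding accuracy: {cross_accuracy:.2f}")
```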
& that's, that's with EEG? so what kind of signals are you getting for localisation in, in EEG signals?
[LM:] the one paper that is published, we looked at the er, somatosensory evoked potential. & so we would look at er, at, these different erm, potentials: like, the, the P50 is the positive potential happening at about 50 milliseconds or so, the N80, which is, the, an important one, peaks at around 80 milliseconds, & this supposedly indexes kind of feedback processes from parietal cortex back into S1, & has been linked with some spatial processing, for example localising where touch is in space. & this is actually where we found the bulk of the location-specific signal for the tool & the arm, was, was starting around, let's say, er, 50 or so milliseconds, so really early.
yeah, i'll have to read that. i just assumed, like all good undergraduates would assume, that you can't get much about spatial localisation from, from EEG. but i guess you can.
[LM:] well, so i should say that one thing we did to actually tag space was er, repetition suppression. so, if we repeated hitting at the same location on the arm, or the same location on the, on the tool, you might expect that in these repetition paradigms, there should be a difference, a difference between a trial that is repeated at the same location versus one that's not. so this kind of allowed us to try & find location-specific signals. then there's also the aspect of space in terms of, like, brain space, right? so, taking the EEG signal & projecting it down onto the cortical surface. & it's not great, definitely not as good as MEG [magnetoencephalography] or fMRI [functional magnetic resonance imaging], but it's good enough to get a general idea. so we saw signals that should be in S1, specific to S1. we saw signals that should be more posterior parietal, in posterior parietal. so there does seem to be at least some ability.
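a minimal sketch, in Python, of the repetition-suppression contrast Luke describes, assuming the EEG has already been epoched & sorted into repeated-location versus non-repeated trials; the array names & shapes are assumptions for illustration only.

```python
# a rough sketch of a repetition-suppression contrast on epoched EEG;
# the arrays below are hypothetical placeholders (trials x channels x time).
import numpy as np

repeat_trials    = np.random.randn(100, 64, 300)  # same location as previous trial
nonrepeat_trials = np.random.randn(100, 64, 300)  # different location

erp_repeat    = repeat_trials.mean(axis=0)     # average over trials -> ERP
erp_nonrepeat = nonrepeat_trials.mean(axis=0)

# location-specific signal: response suppressed when the location repeats
repetition_effect = erp_nonrepeat - erp_repeat
print(repetition_effect.shape)  # channels x timepoints
```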
Marie do you have anything to chip in?
[MM:] no, i mean i'm, i'm always fascinated by Luke's studies, so i'm listening very er, very intently to what he's saying right now. er, but um, yeah, no i, i really hope that at some point we can really understand a bit better what exactly tool use does to your body & how you plan a movement using a tool. uh, so I agree with what Luke was saying earlier, that it's important we start looking at what exactly is happening while you are using a tool. i mean i'm, i'm development oriented, so i'm always interested in what happens in kids, & what happens when kids cannot move properly & have a movement disorder, & i think all of these erm, atypical populations are a good way to look at what could happen. so, if you have someone who is blind or someone who has difficulty with proprioception, how does that affect your, your ability to use a tool or to have a plastic body representation.
yeah, you've mentioned the debate over space versus body, we shouldn't have had; & maybe we shouldn't be using these terms like body schema. but is there anything else? any particular series of experiments maybe, that hasn't worked, or that never got followed-up, or that can't replicate, is there any, any sort of black holes in this literature that we should be wary of?
[LM:] i'm actually not aware of any studies that are known to have failed to replicate, at least er, in the, in maybe the more body representation focused part, where people focus on whether it changes er, the kinematics or the tactile perception, or another very common task is arm bisection - so where is the midpoint of your arm. i think where some studies have failed to replicate, which you know quite well, Nick, is in the peripersonal space literature part. different effects, er, for example the multisensory effects along the entire surface of, of the cane or the tool, versus focused more on the hands or the tips. i think, in there there was some failure to replicate a lot of the effects. but i think one reason could be that i don't think actually a lot of people do research on how tools change body representation. & so in some ways, maybe one reason why a lot of these effects haven't been, haven't been shown to replicate or not, is there's not many replicators out there.
yeah, something we've missed from the lockdowns & the pandemic is this sort of chat in conferences, people telling each other about what works & what doesn't, because all we know, really, is what gets into the, the published press. less of a problem now, but do you think that the bigger problem is not many people are going to be working on this topic?
[LM:] i think that could be a problem. i think, another, another problem is that we don't really know how many of these tasks would actually map onto what you might call body representation. & so without a good model of the relationship between these tasks & how people do the tasks & the underlying state estimates or representations, it would be hard to know what, what did or did not replicate. for example, uh, the task where you have people tell you where the midpoint of their arm is. you find that after using a tool that changes, & it seems to distalise - so now the midpoint is kind of further away. but does that mean the arm representation has got longer or that the arm representation is shifted? maybe the whole arm just shifted in space? so, & because there's also conflicting findings there, that some tool use tasks seem to distalise the midpoint, others seem to proximalise it - it's now closer. is that a failure to replicate or is that like a true effect? i think it's really hard to know without knowing these tasks better than people do.
& do you think this is one of the questions for the next twenty-five years?
[LM:] i think in the next twenty-five years it would be really, really helpful to er, let's say, develop computational or formal models of these tasks & the representations, in the same way that the motor control field has kind of formalised the process, or at least partitioned the process of reaching to an object in terms of state estimates & feedback gains & what not. so that kind of makes it much more concrete. so then if you have this, you can run a, let's say a, tool use task, or you can do tool use & see how it changes performance on this task, & because you kind of have a computational model of the process that goes into actually performing the behaviour, you can see which aspect of it was modulated. that's how i see the, the future going.
great. & Marie, how do you, how do you think you will look back on the next twenty-five years of research? what do you think the big questions are?
[MM:] er, well i quite agree with Luke, i must say, i think, for the last month, every time we chat we're discussing models & tools, & OK, what about this & this, & trying to wrap our heads around, OK, what if your arm is longer, then what does this mean in terms of your movement & joint control? & er, so i think we're going in the same direction. one other aspect, but it's a bit further away from what we're doing, could just be using tools for other things, so if you consider that you use a tool as if you would use your arm, & given all the links between like, language & motor processes, i think there is a very er, interesting field that is starting also in Lyon, now, they're doing studies on the link between language & motor processes, & how tool use can change your language & er, help re-education & rehabilitation in different disorders. so there are just, a few PhD students working on that at the moment & they are just starting very fascinating work. so, um, i think it's also a direction we could go.
i spent a lot of erm, yesterday reading this - er - really difficult-to-read review of the ecological approach to tool use. have you had any interactions with ecological tool use people, or ecological perception, like, Gibsonian, affordances? yeah, Luke?
[LM:] i guess i have a couple of things to say about the ecological approach to tool use. i think on, on the one hand, they really focus on, let's say, the physical aspects of tool use. & so, there's a lot of work by Wagman, or Michael Turvey, Claudia Carello, over the last, let's say, er, thirty years or so, that has looked into, OK, what are the physical aspects of a rod, let's say, like the inertia & the torques that would go into actually holding the rod or using the rod, & how does that actually map onto the performance. & they've shown that you can actually close your eyes & tell how long the rod is, & this seems to relate to the second moment of er, of inertia of the rod, so if you change that moment by adding, let's say, weights or tangents to the rod, that can change your perception.
[LM:] um, so i think that that's really interesting & that's really valuable, & that has actually influenced, in some sense, my work. & i think that's really interesting & that's really important because this is what your motor system has to take into account, right, if it wants to actually wield the rod, it has to know what the torques are, but it also has to know how these interact - the torques at your shoulder, the torques at your wrist, the torques at your, your elbow, & the interaction torques - in order to control it, & so i think that's really important.
[LM:] but where they go wrong is - that's all they focus on. so they try & build an entire theory of tool use around just these physical aspects, & they essentially ignore any kind of internal aspects that might be happening during tool use. they ignore er, coordinate transformations, or they, they ignore representation altogether. so i think that you're only going to get so far if all you're caring about is just the physics & not actually caring about what the nervous system actually does.
yeah i think that was my feeling as well, that there's very clear descriptions of what information is available - out there - in the optic flow, or in the haptic flow, whatever it is.
[LM:] yeah, the, the haptic array, i guess.
so, what is the, what's the brain doing?
[LM:] yeah exactly, yeah - they kind of stop at the nervous system.
[🎶 sloshed 🎶]
one service we offer here at the error bar is absolution for science sins. so, are there any scientific errors or sins that you would like to be forgiven for, Marie?
[MM:] i've, i've not been doing science for a long time, so i don't know if it's er, too early to, to have any, i mean i'm still learning. so, i guess one thing could be that, at the very beginning, i might not have been as interested in all the Open Science stuff: preregister what you think, make sure people other than yourself can actually understand your script, & read it. & er, i've been working on that lately, but i think i'm still a little bit far, so depending on the project, you know, you use different statistics, different ways of scripting, because you have different co-authors, & it's a bit difficult to find a way around all that, but er, maybe one day.
& Luke, is there anything you want to confess?
[LM:] i think maybe the, the biggest scientific sin, although i guess it depends who you ask whether it's a sin or not, is that, that all of my research - or at least most of my research in my PhD, which was directly related to what aspects of tool use change the body er, representation - were all between-subject designs. & so i would run like, the tool use session that was supposed to have an effect in, like the, like the fifteen subjects, & then i'd run another fifteen subjects in, like, the control. & so, i think that really limits the inferences you can draw, because you're not actually comparing the conditions in the same er, population. & so i think that, er, now all my research always uses within-participant designs. but i think that was a major flaw in pretty much, my PhD research.
you're forgiven.
thank you. i feel much better now.