Sensory integration research: Who is it for?
The March/April issue of AJOT has two articles on sensory integration that are worth discussing.
The first is Verification and clarification of patterns of sensory integrative dysfunction (Mailloux, Mulligan, Smith Roley, et al., 2011). This article is another factor analysis study that has to be considered in the context of a number of other studies, including Ayres' (1989) original cluster and factor analyses that went into SIPT standardization, Mulligan's 1998 and 2000 cluster and factor analyses, and the critically appraised topic written by Davies and Tucker (2010).
I'm not sure how many street-level practitioners read cluster and factor analysis studies, but I don't think most people put them at the top of their reading list. I think this is because we don't spend much time educating practitioners on these methods and what they mean. I find these statistical models interesting, but they have a serious fundamental flaw: they are based on heuristic models of interpretation. In other words, in the case of the SIPT, we are trying to label conditions based on a defined set of variables that supposedly 'make up' a construct called 'sensory integration' or perhaps 'praxis.'
The truth is that we use those 17 tests as a point of convenience even though we have a lot of data telling us there are problems with the reliability of some of those tests. On top of that, we have also expanded our thinking into more dynamic systems models, and to be honest I have no idea how you apply factor analysis in a world of non-linear dynamics. I know enough to know that I don't have the math background for that kind of thinking.
Maybe it isn't a math problem as much as it is a philosophical problem, and that brings us back around to the heuristics problem. I can't help thinking that we are drawing contrived conclusions that might not really reflect the full data set. If you read all the factor and cluster analyses and the interpretations that have been offered of them, you will see that factors and clusters have been identified, then clarified and redefined, and in this most recent study we have come full circle to claiming consistency with Ayres' original conclusions.
If there are any street-level people reading this stuff, they are probably wondering:
1. Which is most 'true': the Ayres data set, the Mulligan data set, the Davies/Tucker interpretation, or now the Mailloux/Mulligan et al. data set?
2. In 20+ years of shifting conclusions, has any of this actually made a difference to how clinicians practice?
3. Is any of this even in sync with the notion of occupation-based practice?
I am concerned that decisions about restandardizing the SIPT will be made based on the heuristic interpretation of these data sets. Since we haven't historically done a good job of even defining what SI is, this is kind of like building a castle on a foundation of sand.
All of this leads to the overwhelming question of WHO CARES and WHO IS THIS REALLY WRITTEN FOR ANYWAY? This research has no application to practice. My concern is that in the next 20 years someone else will decide to be an eigenvalue purist who insists THERE MUST BE a six-factor solution, and they will contribute to another 20 years of gear-spinning. Will this bring our practice any further along?
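To make that eigenvalue point concrete, here is a minimal sketch, not taken from either article, of how the retention heuristic an analyst picks drives the answer. It simulates scores on a hypothetical 17-test battery (the count echoes the SIPT, but the data are random, not SIPT scores) and shows that two common cutoffs can disagree about how many factors are 'really' there; the sample size, loadings, and 70% variance threshold are all arbitrary assumptions for illustration.

```python
# Sketch only: simulated data, not SIPT scores; cutoffs are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Simulate scores on 17 hypothetical subtests driven by 4 latent abilities
n_children, n_tests, n_latent = 300, 17, 4
latent = rng.normal(size=(n_children, n_latent))
loadings = rng.normal(scale=0.6, size=(n_latent, n_tests))
scores = latent @ loadings + rng.normal(scale=1.0, size=(n_children, n_tests))

# Eigenvalues of the correlation matrix drive most factor-retention heuristics
eigvals = np.linalg.eigvalsh(np.corrcoef(scores, rowvar=False))[::-1]

kaiser = int(np.sum(eigvals > 1.0))  # Kaiser criterion: keep eigenvalues > 1
variance_70 = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.70) + 1)

print(f"Factors by Kaiser rule:        {kaiser}")
print(f"Factors to reach 70% variance: {variance_70}")
# The two rules typically disagree, which is the 'heuristic interpretation'
# problem: the reported factor structure depends on the analyst's cutoff.
```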
******************
On to the next article...
Parham, Smith Roley, May-Benson, et al. wrote Development of a fidelity measure for research on the effectiveness of the Ayres Sensory Integration Intervention (ASI). This long-anticipated article developed a fidelity measure for use in research on ASI. I understand that this is not a practice tool; the point is to more clearly operationalize our terms and definitions for research, which in theory is supposed to eventually inform our practice.
Structural and process elements were identified, but only the process elements were validated. Unfortunately, the only people who can tell you whether you are appropriately incorporating the process elements are the handful of specially trained experts who defined what those elements are. That drives the whole fidelity instrument into a ditch of confirmation bias, and really I don't know what else to say about it from that point.
Structural elements were identified but not validated. Presumably these would be elements that could be more easily confirmed by untrained observers. The problem with the structural elements is that they require post-professional training in SI, restrictive space and equipment arrangements, and levels of communication that are rarely achieved in many practice settings.
Taken together, these requirements make ASI, as it is described, apropos of nothing: if only a couple of experts can tell you whether you are doing it, and if your practice setting precludes the structural elements, then a fidelity measure won't matter much, because the model is not applicable to the realities of street-level practice. In my thinking, these two articles do not contribute to practice, and they demonstrate quite clearly that we should rework the model until we come up with something that reflects actual practice and perhaps incorporates a broader occupation-based framework. While we are at it, we might drop those sensory processing interventions that have not been supported by research.
References:
Ayres, A. J. (1989). Sensory Integration and Praxis Tests. Los Angeles: Western Psychological Services.
Davies, P. L., & Tucker, R. (2010). Evidence review to investigate the support for subtypes of children with difficulty processing and integrating sensory information. American Journal of Occupational Therapy, 64, 391–402.
Mailloux, Z., Mulligan, S., Smith Roley, S., et al. (2011). Verification and clarification of patterns of sensory integrative dysfunction. American Journal of Occupational Therapy, 65, 143–151.
Mulligan, S. (1998). Patterns of sensory integration dysfunction: A confirmatory factor analysis. American Journal of Occupational Therapy, 52, 819–828.
Mulligan, S. (2000). Cluster analysis of scores of children on the Sensory Integration and Praxis Tests. Occupational Therapy Journal of Research, 20(4), 258–270.
Parham, L. D., Smith Roley, S., May-Benson, T. A., et al. (2011). Development of a fidelity measure for research on the effectiveness of the Ayres Sensory Integration Intervention. American Journal of Occupational Therapy, 65, 133–142.