Normalization process theory

From Wikipedia, the free encyclopedia

Normalization process theory is a sociological theory of the implementation, embedding, and integration of new technologies and organizational innovations developed by Carl R. May, Tracy Finch, and colleagues. The theory is a contribution to the field of science and technology studies (STS), and is the result of a programme of theory building by May and his co-researchers.[1] Through three iterations, the theory has built upon the Normalization Process Model previously developed by May et al. to explain the social processes that lead to the routine embedding of innovative health technologies.[2][3]

Normalization Process Theory focuses attention on agentic contributions – the things that individuals and groups do to operationalize new or modified modes of practice as they interact with dynamic elements of their environments. It defines implementation, embedding, and integration as a process that occurs when participants deliberately initiate and seek to sustain a sequence of events that bring it into operation. The dynamics of implementation processes are complex, but Normalization Process Theory facilitates understanding by focusing attention on the mechanisms through which participants invest in and contribute to them. It reveals “the work that actors do as they engage with some ensemble of activities (that may include new or changed ways of thinking, acting, and organizing) and by which means it becomes routinely embedded in the matrices of already existing, socially patterned, knowledge and practices”.[4] In a paper published under a Creative Commons license, May and colleagues describe how, since 2006, NPT has undergone three iterations;[5] these have explored objects, agents, and contexts.

  • Objects. The first iteration of the theory focused attention on the relationship between the properties of a complex healthcare intervention and the Collective Action of its users. Here, agents’ contributions are made in reciprocal relationship with the emergent capability that they find in the objects – the ensembles of behavioural and cognitive practices – that they enact. These socio-material capabilities are governed by the possibilities and constraints presented by objects, and the extent to which they can be made workable and integrated into practice as they are mobilized.[6][7]
  • Agents. The second iteration of the theory built on the analysis of Collective Action, and showed how this was linked to the mechanisms through which people make their activities meaningful and build commitments to them.[8] Here, investments of social structural and social cognitive resources are expressed as emergent contributions to social action through a set of generative mechanisms: coherence (what people do to make sense of objects, agency, and contexts); cognitive participation (what people do to initiate and be enrolled into delivering an ensemble of practices); collective action (what people do to enact those practices); and reflexive monitoring (what people do to appraise the consequences of their contributions). These constructs are the core of the theory, and provide the foundation of its analytic purchase on practice.
  • Contexts. The third iteration of the theory developed the analysis of agentic contributions by offering an account of centrally important structural and cognitive resources on which agents draw as they take action.[9] Here, dynamic elements of social contexts are experienced by agents as capacity (the social structural resources that they possess, including informational and material resources, and social norms and roles) and potential (the social cognitive resources that they possess, including knowledge and beliefs, and individual intentions and shared commitments). These resources are mobilized by agents when they invest in the ensembles of practices that are the objects of implementation.

Normalization process theory is a true middle-range theory that is located within the 'turn to materiality' in STS. It therefore fits well with the case-study-oriented approach to empirical investigation used in STS. It also offers a straightforward alternative to actor–network theory, in that it does not insist on the agency of non-human actors and seeks to be explanatory rather than descriptive. However, because Normalization Process Theory specifies a set of generative mechanisms that empirical investigation has shown to be relevant to the implementation and integration of new technologies, it can also be used in larger-scale structured and comparative studies. Although it fits well with the interpretive approach of ethnography and other qualitative research methods,[10] it also lends itself to systematic review[11][12] and survey research methods. As a middle-range theory, it can be federated with other theories to explain empirical phenomena. It is compatible with theories of the transmission and organization of innovations, especially diffusion of innovations theory, labor process theory, and psychological theories including the theory of planned behavior and social learning theory.

References

  1. ^ May CR, Mair F, Finch T, Macfarlane A, Dowrick C, Treweek S, Rapley T, Ballini L, Ong BN, Rogers A, Murray E, Elwyn G, Légaré F, Gunn J, Montori VM. Development of a theory of implementation and integration: normalization process theory. Implement Sci. 2009 May 21;4:29.
  2. ^ May, C., 2006. A rational model for assessing and evaluating complex interventions in health care. BMC Health Services Research. 6.
  3. ^ May, C., et al., 2007. Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Services Research. 7.
  4. ^ May, C., Finch, T., 2009. Implementation, embedding, and integration: an outline of Normalization Process Theory. Sociology. 43(3), 535-554.
  5. ^ May, C., Sibley, A. and Hunt, K. (2014) 'The nursing work of hospital-based clinical practice guideline implementation: An explanatory systematic review using Normalisation Process Theory', International Journal of Nursing Studies, 51(2), 289-299
  6. ^ May, C., 2006. A rational model for assessing and evaluating complex interventions in health care. BMC Health Services Research. 6.
  7. ^ May, C., et al., 2007. Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Services Research. 7.
  8. ^ May, C., Finch, T., 2009. Implementation, embedding, and integration: an outline of Normalization Process Theory. Sociology. 43, 535-554.
  9. ^ May, C., 2013. Towards a general theory of implementation. Implementation Science. 8, 18.
  10. ^ Gallacher K, May CR, Montori VM, Mair FS. Understanding patients' experiences of treatment burden in chronic heart failure using normalization process theory. Annals of Family Medicine 2011;14(4):351-360.
  11. ^ Mair F, May C, O'Donnell C, Finch T, Sullivan F, Murray E. Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review. Bulletin of the World Health Organisation 2012;90:357-364.
  12. ^ Gallacher K, Jani B, Morrison D, Macdonald S, Blane D, Erwin P, May CR, Montori VM, Eton DT, Smith F, Batty DG, Mair FS. Qualitative systematic reviews of treatment burden in stroke, heart failure, and diabetes - Methodological challenges and solutions. BMC Medical Research Methodology 2013;13(10).