1
Rodrigues CP, Duchesne C, Poulin É, Lapointe-Garant PP. In-line cosmetic end-point detection of batch coating processes for colored tablets using multivariate image analysis. Int J Pharm 2021; 606:120953. PMID: 34329698. DOI: 10.1016/j.ijpharm.2021.120953.
Abstract
In this study, an in-line Process Analytical Technology (PAT) for cosmetic (non-functional) coating unit operations is developed using images of the tablet bed acquired in real-time by an inexpensive industrial camera and lighting system. The cosmetic end-point of multiple batches, run under different operating conditions, is automatically computed from these images using a Multivariate Image Analysis (MIA) methodology in conjunction with a stability determination strategy. The end-points detected by the algorithm differed, on average, by 3% in terms of total batch time from those identified visually by a trained operator. Since traditional practice typically relies on a coating overage to ensure full batch aspect homogeneity in the face of disturbances, the current in-line method can be used to reduce coating material and processing time (over 40% for the operating policy adopted in this work). Additionally, monitoring of the color features calculated by the algorithm allowed the identification of abnormal process conditions affecting visible coating uniformity. This work also addresses practical challenges related to image acquisition in the harsh environment of a pan coater, bringing this tool closer to a state of maturity for implementation in production units and opening the path for their optimization, monitoring, and automatic control.
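The stability-determination idea behind the end-point detection can be sketched as follows. The actual study computes color features through multivariate image analysis (PCA-based score projections); the mean-color feature, window length, and tolerance below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def color_feature(image):
    """Mean color of a tablet-bed image: a simple stand-in for the
    MIA color score (the paper projects pixels onto latent components)."""
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def detect_endpoint(features, window=20, tol=0.5):
    """Declare the cosmetic end-point once the color trajectory has been
    stable (spread below `tol`) over `window` consecutive images.

    Returns the index of the first image confirming stability, or None
    if the batch never stabilizes within the recorded sequence."""
    feats = np.asarray(features)
    for t in range(window, len(feats) + 1):
        if feats[t - window:t].std(axis=0).max() < tol:
            return t - 1
    return None
```

On a trajectory that ramps up and then plateaus (coating color converging to the target), the detector fires shortly after the plateau begins; on a trajectory that never settles, it returns `None`.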
Affiliation(s)
- Cecilia Pereira Rodrigues
- Laboratoire d'observation et d'optimisation des procédés (LOOP), Université Laval, Pavillon Adrien-Pouliot, Québec (Québec), G1V 0A6, Canada
- Carl Duchesne
- Laboratoire d'observation et d'optimisation des procédés (LOOP), Université Laval, Pavillon Adrien-Pouliot, Québec (Québec), G1V 0A6, Canada
- Éric Poulin
- Laboratoire d'observation et d'optimisation des procédés (LOOP), Université Laval, Pavillon Adrien-Pouliot, Québec (Québec), G1V 0A6, Canada
2
Shen W, Tu Y, Gollub RL, Ortiz A, Napadow V, Yu S, Wilson G, Park J, Lang C, Jung M, Gerber J, Mawla I, Chan ST, Wasan AD, Edwards RR, Kaptchuk T, Li S, Rosen B, Kong J. Visual network alterations in brain functional connectivity in chronic low back pain: A resting state functional connectivity and machine learning study. Neuroimage Clin 2019; 22:101775. PMID: 30927604. PMCID: PMC6444301. DOI: 10.1016/j.nicl.2019.101775.
Abstract
Chronic low back pain (cLBP) is associated with widespread functional and structural changes in the brain. This study investigates resting state functional connectivity (rsFC) changes of visual networks in cLBP patients and the feasibility of distinguishing cLBP patients from healthy controls using machine learning methods. cLBP patients (n = 90) and control individuals (n = 74) were enrolled and underwent resting-state BOLD fMRI scans. Primary, dorsal, and ventral visual networks derived from independent component analysis were used as regions of interest to compare rsFC changes between cLBP patients and healthy controls. A support vector machine classifier was then applied to distinguish cLBP patients from control individuals, and the results were verified in a new cohort of subjects. We found that the functional connectivity between the primary visual network and the somatosensory/motor areas was significantly enhanced in cLBP patients, and that the rsFC between the primary visual network and S1 was negatively associated with the duration of cLBP. In addition, the rsFC of the visual network achieved a classification accuracy of 79.3% in distinguishing cLBP patients from healthy controls, a result further validated in an independent cohort (accuracy = 66.7%). Our results demonstrate significant changes in the rsFC of the visual networks in cLBP patients. We speculate these alterations may represent an adaptation/self-adjustment mechanism and cross-modal interaction between the visual, somatosensory, motor, attention, and salience networks in response to cLBP. Elucidating the role of the visual networks in cLBP may shed light on the pathophysiology and development of the disorder.
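The classification step described above — a support vector machine trained on rsFC features and scored by cross-validation — can be sketched with scikit-learn. The feature matrix here is synthetic (the group sizes match the paper's discovery cohort, but the 50 connectivity values, effect size, and linear kernel are illustrative assumptions, not the study's actual pipeline):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for visual-network rsFC values, one row per subject:
# patients carry a shifted mean to mimic the reported enhanced connectivity.
X_clbp = rng.normal(0.3, 0.1, size=(90, 50))   # cLBP patients
X_ctrl = rng.normal(0.0, 0.1, size=(74, 50))   # healthy controls
X = np.vstack([X_clbp, X_ctrl])
y = np.array([1] * 90 + [0] * 74)              # 1 = cLBP, 0 = control

# Linear SVM with feature standardization, scored by 5-fold cross-validation
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
accuracy = cross_val_score(clf, X, y, cv=5).mean()
```

With a genuine effect in the features, cross-validated accuracy rises above chance; validating the trained model on a held-out cohort, as the study does, guards against optimistic within-sample estimates.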
Affiliation(s)
- Wei Shen
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA; First Affiliated Hospital of Hainan Medical College, Hainan Medical University, Haikou, Hainan, China
- Yiheng Tu
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Randy L Gollub
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Ana Ortiz
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Vitaly Napadow
- Department of Radiology, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, USA
- Siyi Yu
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Georgia Wilson
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Joel Park
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Courtney Lang
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Minyoung Jung
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Jessica Gerber
- Department of Radiology, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, USA
- Ishtiaq Mawla
- Department of Radiology, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, USA
- Suk-Tak Chan
- Department of Radiology, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, USA
- Ajay D Wasan
- Department of Anesthesiology, Center for Pain Research, University of Pittsburgh, Pittsburgh, PA, USA
- Robert R Edwards
- Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA
- Ted Kaptchuk
- Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA
- Shasha Li
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
- Bruce Rosen
- Department of Radiology, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Harvard Medical School, Charlestown, MA, USA
- Jian Kong
- Department of Psychiatry, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA
3
González I, López-Nava IH, Fontecha J, Muñoz-Meléndez A, Pérez-SanPablo AI, Quiñones-Urióstegui I. Comparison between passive vision-based system and a wearable inertial-based system for estimating temporal gait parameters related to the GAITRite electronic walkway. J Biomed Inform 2016; 62:210-23. PMID: 27395370. DOI: 10.1016/j.jbi.2016.07.009.
Abstract
Quantitative gait analysis allows clinicians to assess the inherent gait variability over time, a functional marker that aids in the diagnosis of disabilities and diseases such as frailty, the onset of cognitive decline, and neurodegenerative diseases, among others. However, despite the accuracy achieved by current specialized systems, there are constraints that limit quantitative gait analysis: the cost of the equipment, the limited access for many people, and the lack of solutions to consistently monitor gait on a continuous basis. In this paper, two low-cost systems for quantitative gait analysis are presented: a wearable inertial system that relies on two wireless acceleration sensors mounted on the ankles, and a passive vision-based system that externally estimates the measurements through a structured light sensor and 3D point-cloud processing. Both systems are compared with a reference clinical instrument using an experimental protocol focused on the feasibility of estimating temporal gait parameters over two groups of healthy adults (five elderly and five young subjects) under controlled conditions. The error of each system with respect to the ground truth is computed. Inter-group and intra-group analyses are also conducted to compare the performance of the two technologies with each other and with the reference system. This comparison under controlled conditions is a required preliminary stage toward adapting both solutions for Ambient Assisted Living environments and for continuous in-home gait monitoring as part of future work.
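A minimal version of deriving a temporal gait parameter from an ankle-mounted acceleration sensor could look like the sketch below. The threshold-crossing heel-strike detector and the 1.5 g threshold are simplifying assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def stride_times(accel_mag, fs, threshold=1.5):
    """Estimate stride times (seconds) from one ankle's acceleration magnitude.

    Heel strikes are taken as rising crossings of `threshold` (in g).
    Consecutive strikes of the same foot are one stride apart, so their
    spacing in samples divided by the sampling rate `fs` (Hz) gives the
    stride time.
    """
    above = accel_mag > threshold
    strikes = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising edges
    return np.diff(strikes) / fs
```

For example, a 100 Hz signal with impact spikes every 100 samples yields stride times of 1.0 s each; the variability of these intervals over a long recording is exactly the kind of marker the abstract describes.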
Affiliation(s)
- Iván González
- University of Castilla-La Mancha, Paseo de la Universidad 4, 13071 Ciudad Real, Spain
- Irvin H López-Nava
- Instituto Nacional de Astrofísica, Óptica y Electrónica, Luis Enrique Erro 1, 72840 Puebla, Mexico
- Jesús Fontecha
- University of Castilla-La Mancha, Paseo de la Universidad 4, 13071 Ciudad Real, Spain
- Angélica Muñoz-Meléndez
- Instituto Nacional de Astrofísica, Óptica y Electrónica, Luis Enrique Erro 1, 72840 Puebla, Mexico
4
Abstract
Agriculture is often described as the backbone of the Indian economy; the sector contributed 14.6% of national GDP (Gross Domestic Product) in 2010. To attain a growth rate comparable to that of industry (about 9%), Indian agriculture must modernize and adopt automation at various stages of cultivation and in post-harvesting techniques. The use of computers to assess the quality of fruits is one of the major activities in post-harvesting technology. At present, this assessment is done largely manually, except for a few fruits, and fruit quality assessment by machine vision in India is still at the research level. Major research has been carried out in countries such as China, Malaysia, the UK, and the Netherlands. To suit the Indian market and the outlook of Indian farmers, it is necessary to develop indigenous technology. This paper is a first step toward evaluating the research carried out by the community worldwide on tropical fruits. The survey concentrates on the tropical fruits of the state of Maharashtra, with a focus on image processing algorithms.
Affiliation(s)
- Suchitra A Khoje
- Department of Electronics and Telecommunication, Symbiosis International University, Lavale, Pune, 412115, Maharashtra, India