1. Computers make mistakes and AI will make things worse - the law must recognize that. Nature 2024;625:631. [PMID: 38263299] [DOI: 10.1038/d41586-024-00168-8]
2. Zhang X, Shen H, Lv Z. Deployment optimization of multi-stage investment portfolio service and hybrid intelligent algorithm under edge computing. PLoS One 2021;16:e0252244. [PMID: 34086735] [PMCID: PMC8177502] [DOI: 10.1371/journal.pone.0252244]
Abstract
The purposes of this work are to improve server deployment capability under Mobile Edge Computing (MEC), reduce the time delay and energy consumption of terminals during task execution, and improve user service quality. After the server deployment problems of traditional edge computing are analyzed, a multi-stage task resource allocation model is proposed to solve the communication problem between different supporting devices. This model establishes a combined task resource allocation and task offloading method and optimizes server execution by utilizing the time delay and energy consumption required for task execution and comprehensively considering the constraints on task offloading, partitioning, and transmission. For the MEC process that supports dense networks, a multi-hybrid intelligent algorithm based on energy consumption optimization is proposed. The algorithm converts the original problem into a power allocation problem via a heuristic model. Simultaneously, it determines the appropriate allocation strategy through distributed planning, duality, and upper-bound replacement. Results demonstrate that the proposed multi-stage combination-based service deployment optimization model can effectively minimize the maximum task execution energy consumption jointly with task offloading and resource allocation. The algorithm performs well in handling user fairness and worst-case task execution energy consumption. The proposed hybrid intelligent algorithm can partition tasks into task offloading sub-problems and resource allocation sub-problems, meeting users' task execution needs. A comparison with the latest algorithms also verifies the model's performance and effectiveness. These results can provide a theoretical basis and practical ideas for server deployment and applications under MEC.
3. Pohl S, Ulitzsch E, von Davier M. Using response times to model not-reached items due to time limits. Psychometrika 2019;84:892-920. [PMID: 31054065] [DOI: 10.1007/s11336-019-09669-2]
Abstract
Missing values at the end of a test typically result from test takers running out of time and can thus be understood by studying test takers' working speed. As testing moves to computer-based assessment, response times become available, allowing speed and ability to be modeled simultaneously. Integrating research on response time modeling with research on modeling missing responses, we propose using response times to model missing values due to time limits. We identify similarities between approaches used to account for not-reached items (Rose et al. in ETS Res Rep Ser 2010:i-53, 2010) and the speed-accuracy (SA) model for joint modeling of effective speed and effective ability proposed by van der Linden (Psychometrika 72(3):287-308, 2007). In a simulation, we show (a) that the SA model can recover parameters in the presence of missing values due to time limits and (b) that the response time model, using item-level timing information rather than a count of not-reached items, results in person parameter estimates that differ from those of missing data IRT models applied to not-reached items. We propose using the SA model to model the missing data process and using both ability and speed to describe the performance of test takers. We illustrate the application of the model in an empirical analysis.
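The joint speed-accuracy structure referenced in this abstract can be illustrated with a small simulation. This is a minimal sketch assuming van der Linden's lognormal response-time model and a Rasch model for accuracy; all parameter values below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items, time_limit = 500, 30, 1800  # time limit in seconds (illustrative)

# Person parameters: ability (theta) and speed (tau), mildly correlated.
theta, tau = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 0.09]], n_persons).T

# Item parameters: difficulty b, time intensity beta, time discrimination alpha.
b = rng.normal(0.0, 1.0, n_items)
beta = rng.normal(np.log(60.0), 0.3, n_items)
alpha = np.full(n_items, 2.0)

# Lognormal response times: ln T_pi ~ N(beta_i - tau_p, alpha_i^-2).
log_t = beta[None, :] - tau[:, None] + rng.normal(0.0, 1.0 / alpha, (n_persons, n_items))
rt = np.exp(log_t)

# An item is "not reached" once cumulative working time exceeds the limit,
# so missingness at the end of the test is driven by speed, not ability.
reached = np.cumsum(rt, axis=1) <= time_limit

# Rasch responses, observed only for reached items (NaN = not reached).
p_correct = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
resp = np.where(reached, (rng.random((n_persons, n_items)) < p_correct).astype(float), np.nan)

print("proportion not reached:", round(1.0 - reached.mean(), 3))
```

Because missingness here depends on tau rather than theta, fitting an IRT model that ignores the timing process would treat informative missingness incorrectly, which is the gap the SA-based approach addresses.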
4. Navarro-Torres A, Alastruey-Benedé J, Ibáñez-Marín P, Viñals-Yúfera V. Memory hierarchy characterization of SPEC CPU2006 and SPEC CPU2017 on the Intel Xeon Skylake-SP. PLoS One 2019;14:e0220135. [PMID: 31369592] [PMCID: PMC6675054] [DOI: 10.1371/journal.pone.0220135]
Abstract
SPEC CPU is one of the most common benchmark suites used in computer architecture research. CPU2017 has recently been released to replace CPU2006. In this paper we present a detailed evaluation of the memory hierarchy performance for both the CPU2006 and single-threaded CPU2017 benchmarks. The experiments were executed on an Intel Xeon Skylake-SP, which is the first Intel processor to implement a mostly non-inclusive last-level cache (LLC). We present a classification of the benchmarks according to their memory pressure and analyze the performance impact of different LLC sizes. We also test all the hardware prefetchers showing they improve performance in most of the benchmarks. After comprehensive experimentation, we can highlight the following conclusions: i) almost half of SPEC CPU benchmarks have very low miss ratios in the second and third level caches, even with small LLC sizes and without hardware prefetching, ii) overall, the SPEC CPU2017 benchmarks demand even less memory hierarchy resources than the SPEC CPU2006 ones, iii) hardware prefetching is very effective in reducing LLC misses for most benchmarks, even with the smallest LLC size, and iv) from the memory hierarchy standpoint the methodologies commonly used to select benchmarks or simulation points do not guarantee representative workloads.
5. Lee DS, Orvell A, Briskin J, Shrapnell T, Gelman SA, Ayduk O, Ybarra O, Kross E. When chatting about negative experiences helps - and when it hurts: Distinguishing adaptive versus maladaptive social support in computer-mediated communication. Emotion 2019;20:368-375. [PMID: 30628816] [DOI: 10.1037/emo0000555]
Abstract
Does talking to others about negative experiences improve the way people feel? Although some work suggests that the answer to this question is "yes," other work reveals the opposite. Here we attempt to shed light on this puzzle by examining how people can talk to others about their negative experiences constructively via computer-mediated communication, a platform that people increasingly use to provide and receive social support. Drawing from prior research on meaning-making and self-reflection, we predicted that cueing participants to reconstrue their experience in ways that lead them to focus on it from a broader perspective during a conversation would buffer them against negative affect and enhance their sense of closure compared with cueing them to recount the emotionally arousing details concerning what happened. Results supported this prediction. Content analyses additionally revealed that participants in the reconstrue condition used the word "you" generically (e.g., you cannot always get what you want) more than participants in the recount condition, identifying a linguistic mechanism that supports reconstrual. These findings highlight the psychological processes that distinguish adaptive versus maladaptive ways of talking about negative experiences, particularly in the context of computer-mediated support interactions.
6. King F, Klonoff DC, Kerr D, Hu J, Lyles C, Quinn C, Adi S, Chen K, Hood K, Salber P, de Clercq C, Hu J, Gabbay R. Digital Diabetes Congress 2018. J Diabetes Sci Technol 2018;12:1231-1238. [PMID: 30376739] [PMCID: PMC6232737] [DOI: 10.1177/1932296818805632]
Abstract
Digital health is capturing the attention of the healthcare community. This paradigm whereby healthcare meets the internet uses sensors that communicate wirelessly along with software residing on smartphones to deliver data, information, treatment recommendations, and in some cases control over an effector device. As artificial intelligence becomes more widely used, this approach to creating individualized treatment plans will increase the opportunities for patients, even if they are in remote settings, to communicate with and learn from healthcare professionals. Simple design is needed to promote use of these tools, especially for the purpose of increased adherence to treatment. Widespread adoption by the healthcare industry will require better outcomes data, which will most likely be in the form of safety and effectiveness results from robust randomized controlled trials, as well as evidence of privacy and security. Such data will be needed to convince investors to direct resources into and regulators to clear new digital health tools. Diabetes Technology Society and William Sansum Diabetes Center launched the Digital Diabetes Congress in 2017 because of great interest in determining the potential benefits, metrics of success, and appropriate components of mobile applications for diabetes. The second annual meeting in this series took place on May 22-23, 2018 in San Francisco. This report contains summaries of the meeting's 4 plenary lectures and 10 sessions. This meeting report presents a summary of how 55 panelists, speakers, and moderators, who are leaders in healthcare technology, see the current and future landscape of digital health tools applied to diabetes.
7. Information and Communication Technology (ICT) Standards and Guidelines. Final rule. Fed Regist 2017;82:5790-5841. [PMID: 28102989]
Abstract
We, the Architectural and Transportation Barriers Compliance Board (Access Board or Board), are revising and updating, in a single rulemaking, our standards for electronic and information technology developed, procured, maintained, or used by Federal agencies covered by section 508 of the Rehabilitation Act of 1973, as well as our guidelines for telecommunications equipment and customer premises equipment covered by Section 255 of the Communications Act of 1934. The revisions and updates to the section 508-based standards and section 255-based guidelines are intended to ensure that information and communication technology covered by the respective statutes is accessible to and usable by individuals with disabilities.
8. Katz JE. Virtualization of legacy instrumentation control computers for improved reliability, operational life, and management. Methods Mol Biol 2017;1550:309-324. [PMID: 28188538] [DOI: 10.1007/978-1-4939-6747-6_21]
Abstract
Laboratories tend to be amenable environments for the long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB, or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, reinstallation is a burdensome process fraught with "gotchas" that can derail it: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have running legacy instrumentation, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, back up, and redeploy. I have used this multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer, with all the software installed, ready to control your hardware should your original computer ever be decommissioned.
9. Jarosiewicz B, Sarma AA, Saab J, Franco B, Cash SS, Eskandar EN, Hochberg LR. Retrospectively supervised click decoder calibration for self-calibrating point-and-click brain-computer interfaces. J Physiol Paris 2016;110:382-391. [PMID: 28286237] [PMCID: PMC5591042] [DOI: 10.1016/j.jphysparis.2017.03.001]
Abstract
Brain-computer interfaces (BCIs) aim to restore independence to people with severe motor disabilities by allowing control of a cursor on a computer screen, or of other effectors, with neural activity. However, physiological and/or recording-related nonstationarities in neural signals can limit long-term decoding stability, and it would be tedious for users to pause use of the BCI to perform decoder recalibration routines whenever neural control degrades. We recently demonstrated that a kinematic decoder (i.e. a decoder that controls cursor movement) can be recalibrated using data acquired during practical point-and-click control of the BCI by retrospectively inferring users' intended movement directions based on their subsequent selections. Here, we extend these methods to allow the click decoder to also be recalibrated using data acquired during practical BCI use. We retrospectively labeled neural data patterns as corresponding to "click" during all time bins in which the click log-likelihood (decoded using linear discriminant analysis, or LDA) had been above the click threshold that was used during real-time neural control. We labeled as "non-click" those periods that the kinematic decoder's retrospective target inference (RTI) heuristics determined to be consistent with intended cursor movement. Once these neural activity patterns were labeled, the click decoder was calibrated using standard supervised classifier training methods. Combined with real-time bias correction and baseline firing rate tracking, this set of "retrospectively labeled" decoder calibration methods enabled a BrainGate participant with amyotrophic lateral sclerosis (T9) to type freely across 11 research sessions spanning 29 days, maintaining high-performance neural control over cursor movement and click without needing to interrupt virtual keyboard use for explicit calibration tasks. By eliminating the need for tedious calibration tasks with prescribed targets and pre-specified click times, this approach advances the potential clinical utility of intracortical BCIs for individuals with severe motor disability.
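The retrospective-labeling-then-retrain loop described in this abstract can be sketched on synthetic data. This is a minimal illustration, not BrainGate code: the feature dimensions, thresholds, and "old decoder" weights are all invented, the RTI heuristic is replaced by a trivial stand-in (all remaining bins), and the LDA fit is written out by hand as a pooled-covariance two-class discriminant.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for binned neural features: 40-channel firing-rate
# vectors, with "click" bins shifted in mean relative to "non-click" bins.
n_bins, n_ch = 2000, 40
is_click_true = rng.random(n_bins) < 0.15
X = rng.normal(0.0, 1.0, (n_bins, n_ch))
X[is_click_true] += rng.normal(0.8, 0.1, n_ch)  # per-channel click offset

# Step 1: retrospective labeling. Bins where a previously running click
# decoder's score exceeded its real-time threshold become "click" labels;
# the remaining bins stand in for RTI-verified "non-click" movement periods.
old_w = rng.normal(0.1, 0.05, n_ch)      # hypothetical old decoder weights
score = X @ old_w
labels = score > np.quantile(score, 0.85)

# Step 2: supervised recalibration - fit a fresh two-class LDA on the
# retrospectively labeled bins (shared pooled covariance across classes).
mu0, mu1 = X[~labels].mean(axis=0), X[labels].mean(axis=0)
Xc = np.vstack([X[~labels] - mu0, X[labels] - mu1])
cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(n_ch)
w = np.linalg.solve(cov, mu1 - mu0)      # discriminant direction
b = -0.5 * (mu0 + mu1) @ w               # decision threshold at class midpoint

pred = X @ w + b > 0
print("agreement with true click state:", round(float((pred == is_click_true).mean()), 3))
```

The point of the sketch is the data flow: the labels come from the system's own prior behavior rather than from an explicit calibration task, which is what lets recalibration happen during practical use.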
10. Sayer MDJ, Azzopardi E, Sieber A. User settings on dive computers: reliability in aiding conservative diving. Diving Hyperb Med 2016;46:98-110. [PMID: 27334998]
Abstract
INTRODUCTION
Divers can make adjustments to diving computers when they may need or want to dive more conservatively (e.g., diving with a persistent (patent) foramen ovale). Information describing the effects of these alterations or how they compare to other methods, such as using enriched air nitrox (EANx) with air dive planning tools, is lacking.
METHODS
Seven models of dive computer from four manufacturers (Mares, Suunto, Oceanic and UWATEC) were subjected to single square-wave compression profiles (maximum depth: 20 or 40 metres' sea water, msw), single multi-level profiles (maximum depth: 30 msw; stops at 15 and 6 msw), and multi-dive series (two dives to 30 msw followed by one to 20 msw). Adjustable settings were employed for each dive profile; some modified profiles were compared against stand-alone use of EANx.
RESULTS
Dives were shorter or indicated longer decompression obligations when conservative settings were applied. However, some computers in default settings produced more conservative dives than others that had been modified. Some computer-generated penalties were greater than when using EANx alone, particularly at partial pressures of oxygen (PO₂) below 1.40 bar. Some computers 'locked out' during the multi-dive series; others would continue to support decompression with, in some cases, automatically reduced levels of conservatism. Changing reduced gradient bubble model values on Suunto computers produced few differences.
DISCUSSION
The range of possible adjustments and the non-standard computer response to them complicates the ability to provide accurate guidance to divers wanting to dive more conservatively. The use of EANx alone may not always generate satisfactory levels of conservatism.
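As a point of reference for the EANx comparisons above, the standard air-equivalence arithmetic used by nitrox planning tools can be written in a few lines. These are the textbook formulas, not anything specific to the computers tested in the study; the example depth and mix below are arbitrary.

```python
# Standard nitrox planning formulas (not from the paper itself): equivalent
# air depth (EAD) and oxygen partial pressure (PO2) for a given mix.
def ead_msw(depth_msw: float, o2_fraction: float) -> float:
    """Equivalent air depth in msw for an EANx mix (air N2 fraction 0.79)."""
    n2_fraction = 1.0 - o2_fraction
    return (depth_msw + 10.0) * n2_fraction / 0.79 - 10.0

def po2_bar(depth_msw: float, o2_fraction: float) -> float:
    """Oxygen partial pressure in bar at depth (10 msw is roughly 1 bar)."""
    return o2_fraction * (depth_msw / 10.0 + 1.0)

# EAN32 at 30 msw: the dive can be planned as if it were a shallower air
# dive, provided PO2 stays below the usual 1.4 bar working limit.
print(round(ead_msw(30, 0.32), 1))   # equivalent air depth in msw
print(round(po2_bar(30, 0.32), 2))   # oxygen partial pressure in bar
```

This is the "EANx with air planning tools" baseline the abstract compares the computers' conservative settings against.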
11. Ramsthaler F, Birngruber CG, Kröll AK, Kettner M, Verhoff MA. [True color accuracy in digital forensic photography]. Arch Kriminol 2016;237:190-203. [PMID: 27386623]
Abstract
Forensic photographs must not only be unaltered and authentic, capture context-relevant images, and meet certain minimum requirements for image sharpness and information density; color accuracy also plays an important role, for instance in the assessment of injuries or taphonomic stages, or in the identification and evaluation of traces from photos. The perception of color not only varies subjectively from person to person; as a discrete property of an image, color in digital photos is also influenced to a considerable extent by technical factors such as lighting, acquisition settings, camera, and output medium (print, monitor). For these reasons, consistent color accuracy has so far been limited in digital photography. Because images usually contain a wealth of color information, especially for complex or composite colors or shades of color, and the wavelength-dependent sensitivity to factors such as light and shadow may vary between cameras, the usefulness of issuing general recommendations for camera capture settings is limited. Our results indicate that true image colors can best and most realistically be captured with the SpyderCheckr technical calibration tool for the digital cameras tested in this study. Apart from aspects such as the simplicity and speed of the calibration procedure, a further advantage of the tool is that the results are independent of the camera used and can also be applied to the color management of output devices such as monitors and printers. The SpyderCheckr color-code patches allow true colors to be captured more realistically than with a manual white balance tool or an automatic flash. We therefore recommend that the use of a color management tool be considered for the acquisition of all images that demand high true color accuracy (in particular in the setting of injury documentation).
12. Liebregts J, Sonne M, Potvin JR. Photograph-based ergonomic evaluations using the Rapid Office Strain Assessment (ROSA). Appl Ergon 2016;52:317-324. [PMID: 26360224] [DOI: 10.1016/j.apergo.2015.07.028]
Abstract
The Rapid Office Strain Assessment (ROSA) was developed to assess musculoskeletal disorder (MSD) risk factors for computer workstations. This study examined the validity and reliability of remotely conducted, photo-based assessments using ROSA. Twenty-three office workstations were assessed on-site by an ergonomist, and 5 photos were obtained. Photo-based assessments were conducted by three ergonomists. The sensitivity and specificity of the photo-based assessors' ability to correctly classify workstations was 79% and 55%, respectively. The moderate specificity associated with false positive errors committed by the assessors could lead to unnecessary costs to the employer. Error between on-site and photo-based final scores was a considerable ∼2 points on the 10-point ROSA scale (RMSE = 2.3), with a moderate relationship (ρ = 0.33). Interrater reliability ranged from fairly good to excellent (ICC = 0.667-0.856) and was comparable to previous results. Sources of error include the parallax effect, poor estimations of small joint (e.g. hand/wrist) angles, and boundary errors in postural binning. While this method demonstrated potential validity, further improvements should be made with respect to photo-collection and other protocols for remotely-based ROSA assessments.
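The agreement statistics this abstract reports (sensitivity, specificity, RMSE) are simple to reproduce. The paired classifications and scores below are hypothetical, invented to mirror the study's 23-workstation setting; they are not the study's data.

```python
import math

# (on-site high-risk?, photo-based high-risk?) for 23 hypothetical stations.
pairs = [(True, True)] * 11 + [(True, False)] * 3 + [(False, True)] * 4 + [(False, False)] * 5

tp = sum(a and b for a, b in pairs)          # both flag high risk
fn = sum(a and not b for a, b in pairs)      # photo assessor misses high risk
fp = sum(not a and b for a, b in pairs)      # photo assessor false alarm
tn = sum(not a and not b for a, b in pairs)  # both clear the station

sensitivity = tp / (tp + fn)   # photo assessor catches true high-risk stations
specificity = tn / (tn + fp)   # photo assessor clears true low-risk stations
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")

# RMSE between hypothetical on-site and photo-based final ROSA scores (1-10).
onsite = [6, 4, 7, 3, 5, 8, 2, 6]
photo = [4, 5, 5, 6, 4, 6, 4, 7]
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(onsite, photo)) / len(onsite))
print(f"RMSE {rmse:.1f}")
```

The false positives (fp) are the cases the abstract flags as costly: workstations wrongly classified as high risk trigger interventions the employer does not actually need.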
13. Yu L, Zhang J, Yang Z. [Troubleshooting for Carestream GC1.5 workstation]. Zhongguo Yi Liao Qi Xie Za Zhi 2015;39:463-466. [PMID: 27066694]
Abstract
This paper summarizes the maintenance of four kinds of failures of a Carestream GC1.5 workstation that had been in use for several years (workstation software, host replacement, burning, and workstation transmission) and introduces itemized, step-by-step elimination procedures for specific cases of each failure.
14. Safe use of health information technology. Sentinel Event Alert 2015:1-6. [PMID: 25831561]
15. Stepanova MI, Aleksandrova IE, Sazanyuk ZI, Voronova BZ, Lashneva LP, Shumkova TV, Berezina NO. [Hygienic regulation of the use of electronic educational resources in the modern school]. Gig Sanit 2015;94:64-68. [PMID: 26856144]
Abstract
We studied the effect of academic lessons using a notebook computer and an interactive whiteboard on the functional state of schoolchildren. Using a complex of hygienic and physiological methods, we established that regulation of students' computer activity must take into account not only its duration but also its intensity. Design features of notebook computers were shown both to impede an optimal working posture in primary school children and to increase the risk of developing disorders of vision and the musculoskeletal system. The interactive whiteboard had an activating influence on performance, with favorable dynamics in indices of the students' functional state, provided the optimal density of the lesson and the duration of whiteboard use were maintained. Safety regulations are determined for schoolchildren's work with electronic resources in the educational process.
16. Levanon Y, Lerman Y, Gefen A, Ratzon NZ. Validity of the modified RULA for computer workers and reliability of one observation compared to six. Ergonomics 2014;57:1856-1863. [PMID: 25205040] [DOI: 10.1080/00140139.2014.952350]
Abstract
Awkward body posture while typing is associated with musculoskeletal disorders (MSDs). Valid rapid assessment of computer workers' body posture is essential for the prevention of MSDs in this large population. This study aimed to examine the validity of the modified rapid upper limb assessment (mRULA), which adjusts the rapid upper limb assessment (RULA) for computer workers, and to examine whether one observation during a working day is sufficient or more observations are needed. A total of 29 right-handed computer workers were recruited. RULA and mRULA assessments were conducted, and the observations were then repeated six times at one-hour intervals. A significant moderate correlation (r = 0.6 and r = 0.7 for mouse and keyboard, respectively) was found between the assessments. No significant differences were found between one observation and six observations per working day. The mRULA was found to be valid for the assessment of computer workers, and one observation was sufficient to assess the work-related risk factors.
17. White H. Thinking customization? Proceed with caution. Behav Healthc 2013;33:36-38. [PMID: 24298703]
18. Landen R. High-tech precautions. FDA calls for controls against cyberattacks. Mod Healthc 2013;43:8-9. [PMID: 23875231]
19. Szeto GPY, Wong TKT, Law RKY, Lee EWC, Lau T, So BCL, Law SW. The impact of a multifaceted ergonomic intervention program on promoting occupational health in community nurses. Appl Ergon 2013;44:414-422. [PMID: 23153515] [DOI: 10.1016/j.apergo.2012.10.004]
Abstract
INTRODUCTION
Community nurses are exposed to high physical demands at work, resulting in musculoskeletal disorders. The present study examined the short- and long-term benefits of a multifaceted intervention program designed especially for community nurses in Hong Kong.
METHODS
Fifty community nurses working in 4 local hospitals participated in the study. All of them underwent an 8-week intervention program consisting of ergonomic training, a daily exercise program, equipment modification, computer workstation assessment and typing training.
RESULTS
All participants showed significant improvement in musculoskeletal symptoms and functional outcomes comparing pre- and post-intervention results. Significant reduction in symptom score was observed at 1-year follow-up compared to post-intervention. The symptomatic group (n=40) showed more significant changes overall compared to the asymptomatic group (n=10).
CONCLUSION
Results support the positive benefits, both short- and long-term, of the multifaceted ergonomic intervention programme for community nurses.
20. van Niekerk SM, Louw QA, Grimmer-Somers K, Harvey J, Hendry KJ. The anthropometric match between high school learners of the Cape Metropole area, Western Cape, South Africa and their computer workstation at school. Appl Ergon 2013;44:366-371. [PMID: 23141959] [DOI: 10.1016/j.apergo.2012.09.008]
Abstract
STUDY DESIGN
Descriptive study.
OBJECTIVE
The objective of this study was to present anthropometric data from high school students in the Cape Metropole area, Western Cape, South Africa that are relevant for chair design, and to determine whether the dimensions of computer laboratory chairs currently used in high schools match the linear anthropometrics of high school students.
SUMMARY OF BACKGROUND DATA
Learner-chair mismatch is proposed as a cause of poor postural alignment and spinal pain in adolescents. A learner-chair mismatch is defined as the incompatibility between the dimensions of a chair and the anthropometric dimensions of the learner. Currently, there is no published research ascertaining whether the furniture dimensions in school computer laboratories match the anthropometrics of the students. This may contribute to the high prevalence of adolescent spinal pain.
METHODS
The sample consisted of 689 learners, 13-18 years old. The following body dimensions were measured: stature, popliteal height, buttock-to-popliteal length and hip width. These measurements were matched with the corresponding chair seat dimensions: height, depth and width. Popliteal and seat height mismatch was defined as a seat height either >95% or <88% of the popliteal height. Buttock-popliteal length and seat depth mismatch was defined as a seat depth either >95% or <80% of the buttock-popliteal length. Seat width mismatch was defined as a seat width less than 10% or more than 30% larger than hip width.
RESULTS
Eighty-nine percent of learners did not match the seat height. Only five percent of learners matched the chair depth; for the majority, the seat was too deep. In contrast, 65% of the learners matched the chair width dimension.
CONCLUSIONS
A substantial mismatch was found. The school chairs failed standard ergonomics recommendations for the design of furniture to fit the user. This study supports the conclusion that there is no one-size-fits-all solution. There is an urgent need for chairs that are of different sizes or that are adjustable.
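The fit criteria stated in the methods can be encoded directly. The functions below are a straight transcription of the three mismatch definitions from the abstract; the dimension values in the example are made up, not study data.

```python
# Chair-fit criteria as defined in the abstract (all dimensions in cm).
def seat_height_match(popliteal_height: float, seat_height: float) -> bool:
    # Mismatch when seat height is >95% or <88% of popliteal height.
    return 0.88 * popliteal_height <= seat_height <= 0.95 * popliteal_height

def seat_depth_match(buttock_popliteal: float, seat_depth: float) -> bool:
    # Mismatch when seat depth is >95% or <80% of buttock-popliteal length.
    return 0.80 * buttock_popliteal <= seat_depth <= 0.95 * buttock_popliteal

def seat_width_match(hip_width: float, seat_width: float) -> bool:
    # Seat width should be 10-30% larger than hip width.
    return 1.10 * hip_width <= seat_width <= 1.30 * hip_width

# Hypothetical learner measured against a fixed school chair:
print(seat_height_match(41.0, 46.0))  # seat higher than 95% of popliteal height
print(seat_depth_match(45.0, 40.0))   # within 80-95% of buttock-popliteal length
print(seat_width_match(34.0, 40.0))   # roughly 18% wider than hip width
```

Running such a check over a whole cohort against one fixed chair size is exactly how the reported mismatch percentages arise, and why the conclusion favors adjustable or multi-size chairs.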
21. Mahdiani HR, Fakhraie SM, Lucas C. Relaxed fault-tolerant hardware implementation of neural networks in the presence of multiple transient errors. IEEE Trans Neural Netw Learn Syst 2012;23:1215-1228. [PMID: 24807519] [DOI: 10.1109/TNNLS.2012.2199517]
Abstract
Reliability should be identified as the most important challenge in future nano-scale very large scale integration (VLSI) implementation technologies for the development of complex integrated systems. Normally, fault tolerance (FT) in a conventional system is achieved by increasing its redundancy, which also implies higher implementation costs and lower performance that sometimes makes it even infeasible. In contrast to custom approaches, a new class of applications is categorized in this paper, which is inherently capable of absorbing some degrees of vulnerability and providing FT based on their natural properties. Neural networks are good indicators of imprecision-tolerant applications. We have also proposed a new class of FT techniques called relaxed fault-tolerant (RFT) techniques which are developed for VLSI implementation of imprecision-tolerant applications. The main advantage of RFT techniques with respect to traditional FT solutions is that they exploit inherent FT of different applications to reduce their implementation costs while improving their performance. To show the applicability as well as the efficiency of the RFT method, the experimental results for implementation of a face-recognition computationally intensive neural network and its corresponding RFT realization are presented in this paper. The results demonstrate promising higher performance of artificial neural network VLSI solutions for complex applications in faulty nano-scale implementation environments.
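The premise that neural networks absorb some degree of vulnerability can be checked with a toy experiment. This is a generic illustration of imprecision tolerance, not the paper's RFT technique or its VLSI setting: a small random MLP is evaluated with transient perturbations injected into its weights at increasing fault rates, and all sizes and rates below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def mlp(x, w1, w2):
    """Tiny two-layer network standing in for a neural hardware block."""
    return np.tanh(x @ w1) @ w2

def with_faults(w, fault_rate, magnitude):
    """Perturb a random subset of weights, mimicking transient errors."""
    mask = rng.random(w.shape) < fault_rate
    return np.where(mask, w + rng.normal(0.0, magnitude, w.shape), w)

x = rng.normal(0.0, 1.0, (256, 8))
w1 = rng.normal(0.0, 0.5, (8, 16))
w2 = rng.normal(0.0, 0.5, (16, 4))
clean = mlp(x, w1, w2)

devs = []
for rate in (0.001, 0.01, 0.1):
    noisy = mlp(x, with_faults(w1, rate, 1.0), with_faults(w2, rate, 1.0))
    devs.append(float(np.abs(noisy - clean).mean()))
    print(f"fault rate {rate:>5}: mean output deviation {devs[-1]:.4f}")
```

The graceful (rather than cliff-edge) growth of the output deviation as the fault rate rises is the kind of inherent tolerance that RFT-style designs exploit to reduce redundancy and implementation cost.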
|
22
|
Long J, Helland M. A multidisciplinary approach to solving computer related vision problems. Ophthalmic Physiol Opt 2012; 32:429-35. [PMID: 22540950 DOI: 10.1111/j.1475-1313.2012.00911.x] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
23
|
Raamanathan A, Simmons GW, Christodoulides N, Floriano PN, Furmaga WB, Redding SW, Lu KH, Bast RC, McDevitt JT. Programmable bio-nano-chip systems for serum CA125 quantification: toward ovarian cancer diagnostics at the point-of-care. Cancer Prev Res (Phila) 2012; 5:706-16. [PMID: 22490510 DOI: 10.1158/1940-6207.capr-11-0508] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/31/2022]
Abstract
Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive, and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, and modular (programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multimodal and multimarker screening approaches. In the p-BNCs, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing, and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a three-dimensional microfluidic environment, the p-BNC operating variables (incubation times, flow rates, and reagent concentrations) were tuned to deliver optimal analytical performance under 45 minutes. With short analysis times, competitive analytical performance (inter- and intra-assay precision of 1.2% and 1.9% and limit of detection of 1.0 U/mL) was achieved on this minisensor ensemble. Furthermore, validation with sera of patients with ovarian cancer (n = 20) showed excellent correlation (R(2) = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise, and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics.
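The precision and correlation figures quoted above (inter-/intra-assay CV of 1.2% and 1.9%, R² = 0.97 against ELISA) are standard assay statistics. A minimal sketch of how they are computed, using made-up CA125 readings rather than the study's data:

```python
import math

def cv_percent(replicates):
    """Coefficient of variation (%), the usual way assay precision
    such as inter-/intra-assay figures is reported."""
    mean = sum(replicates) / len(replicates)
    var = sum((x - mean) ** 2 for x in replicates) / (len(replicates) - 1)
    return 100.0 * math.sqrt(var) / mean

def r_squared(x, y):
    """Squared Pearson correlation, the form of the reported
    agreement between a new assay and a gold-standard ELISA."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov ** 2 / (vx * vy)

# Hypothetical CA125 readings (U/mL): replicates of one control sample,
# and paired new-assay vs ELISA values for a small patient panel.
replicates = [34.8, 35.1, 35.4, 34.9, 35.3]
pbnc = [12.0, 35.0, 64.0, 120.0, 410.0]
elisa = [11.5, 36.2, 61.0, 125.0, 400.0]
print(round(cv_percent(replicates), 2), round(r_squared(pbnc, elisa), 3))
```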
|
24
|
Buitinga L, Braakman-Jansen LMA, Taal E, van de Laar MAFJ. A computer Time Trade-Off: a feasible and reliable alternative for the interview Time Trade-Off in rheumatoid arthritis. Clin Exp Rheumatol 2011; 29:783-789. [PMID: 21961923] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/26/2010] [Accepted: 04/18/2011] [Indexed: 05/31/2023]
Abstract
OBJECTIVES The Time Trade-Off (TTO) is an instrument used for valuing health-related quality of life. This study evaluated the test-retest reliability of a computer TTO in rheumatoid arthritis (RA) and compared the computer TTO with the interview TTO regarding feasibility and agreement. METHODS In study 1, using a cross-over design, thirty patients completed both TTOs. In study 2, twenty-nine other patients completed the computer TTO twice to examine test-retest reliability. Feasibility was measured by assessing actual and perceived time duration and the general experience of the patient. Agreement between utility scores of both TTOs was measured by Bland-Altman analysis. RESULTS Both TTOs were feasible. The computer TTO showed high test-retest reliability (ICC = 0.88). Bland-Altman analysis showed a small mean difference (0.06, SD = 0.14, effect size = 0.30) between the two TTOs. Limits of agreement were wide (-0.22 to 0.34). Differences between interview and computer TTO utilities did not vary over the range of scores. CONCLUSIONS The computer TTO was feasible and reliable, but did not produce the same results as the interview TTO. However, no systematic bias in the differences was found over the range of scores.
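Bland-Altman analysis as used here reduces to the mean of the paired differences (the bias), their SD, and 95% limits of agreement at mean ± 1.96 SD. A minimal sketch with hypothetical TTO utilities, not the study's patient data:

```python
import math

def bland_altman(a, b):
    """Bland-Altman agreement: mean difference (bias), SD of the
    differences, and 95% limits of agreement (bias +/- 1.96*SD)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, sd, (mean - 1.96 * sd, mean + 1.96 * sd)

# Hypothetical TTO utilities (0-1 scale) for the same patients,
# scored once by computer and once by interview.
computer = [0.80, 0.65, 0.90, 0.55, 0.70, 0.85]
interview = [0.75, 0.60, 0.88, 0.50, 0.72, 0.78]
bias, sd, (lo, hi) = bland_altman(computer, interview)
print(round(bias, 3), round(sd, 3), round(lo, 3), round(hi, 3))
```

A small bias with wide limits of agreement, as reported in the study, means the two instruments agree on average but can disagree substantially for an individual patient.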
|
25
|
Wienold J, Recknagel S, Scharf H, Hoppe M, Michaelis M. Elemental analysis of printed circuit boards considering the ROHS regulations. WASTE MANAGEMENT (NEW YORK, N.Y.) 2011; 31:530-535. [PMID: 21050740 DOI: 10.1016/j.wasman.2010.10.002] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/12/2010] [Revised: 08/27/2010] [Accepted: 10/01/2010] [Indexed: 05/30/2023]
Abstract
The EU RoHS Directive (2002/95/EC of the European Parliament and of the Council) bans placing on the EU market new electrical and electronic equipment containing more than agreed levels of lead, cadmium, mercury, hexavalent chromium, polybrominated biphenyl (PBB) or polybrominated diphenyl ether (PBDE) flame retardants. This necessitates methods for evaluating the RoHS compliance of assembled electronic equipment. In this study, mounted printed circuit boards from personal computers were analyzed for their content of three elements restricted by the EU RoHS Directive: Cd, Pb and Hg. The main focus of the investigation was the influence of sample pre-treatment on the precision and reproducibility of the results. The sample preparation steps used were based on the guidelines given in EN 62321. Five different dissolution procedures were tested after different preceding sample-treatment steps such as cutting and milling. Elemental analysis was carried out using ICP-OES, XRF and CV-AFS (Hg). The results showed that, for decision-making with respect to RoHS compliance, size reduction of the material to particles ≤ 1.5 mm may already be sufficient. However, to ensure analytical results with relative standard deviations of less than 20%, as recommended by EN 62321, a much larger sample-processing effort towards smaller particle sizes might be required, depending strongly on the mass fraction of the element under investigation.
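The ≤ 20% relative-standard-deviation criterion can be turned into a simple decision sketch: compute the RSD of replicate measurements and only compare the mean against the RoHS limit once the scatter is acceptable, otherwise grind the sample finer and re-analyze. The replicate values and the decision strings below are hypothetical; the 1000 mg/kg figure is the RoHS limit for lead:

```python
import math

def rsd_percent(values):
    """Relative standard deviation (%) of replicate element measurements."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))
    return 100.0 * sd / mean

def rohs_decision(values, limit_mg_kg, max_rsd=20.0):
    """Judge compliance only when replicate scatter meets the
    EN 62321-style RSD recommendation (< 20%); otherwise the sample
    needs further size reduction before a decision is defensible."""
    if rsd_percent(values) >= max_rsd:
        return "inhomogeneous -- grind finer and re-analyze"
    mean = sum(values) / len(values)
    return "non-compliant" if mean > limit_mg_kg else "compliant"

# Hypothetical Pb replicates (mg/kg) from milled board material;
# the RoHS limit for lead is 1000 mg/kg (0.1% by mass).
print(rohs_decision([180.0, 195.0, 188.0], 1000.0))   # low scatter, under limit
print(rohs_decision([400.0, 1900.0, 900.0], 1000.0))  # scatter far above 20% RSD
```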
|