101
Lewis T, Bhaganagar K. A comprehensive review of plume source detection using unmanned vehicles for environmental sensing. Science of the Total Environment 2021; 762:144029. [PMID: 33385789] [DOI: 10.1016/j.scitotenv.2020.144029]
Abstract
Local meteorological conditions, including wind speed and turbulence, significantly influence the dispersion of pollutant plumes, introducing severe difficulties in predicting their trajectories, potential evacuation sites, and ultimately containment efforts. Ongoing developments in estimating rapid contaminant dispersion combine local meteorological data with plume-source localization and identification via autonomous, data-driven mobile-sensing robotic/vehicular platforms. With a vast number of available environmental-sensing mobile platforms, contaminant dispersion scenarios, and source-finding algorithms, selecting the ideal configuration for autonomous source localization involves a great deal of opportunity alongside uncertainty. This paper reviews the significant developments in unmanned ground-based mobile sensing network configurations and autonomous data acquisition strategies commonly used for gaseous plume source localization.
Affiliation(s)
- Tyrell Lewis
- Laboratory of Turbulence, Sensing and Intelligence Systems, Department of Mechanical Engineering, University of Texas at San Antonio, United States of America
- Kiran Bhaganagar
- Laboratory of Turbulence, Sensing and Intelligence Systems, Department of Mechanical Engineering, University of Texas at San Antonio, United States of America
102
K-Graph: Knowledgeable Graph for Text Documents. Journal of KONBIN 2021. [DOI: 10.2478/jok-2021-0006]
Abstract
Graph databases are applied in many areas, including science and business, due to their low complexity, low overheads, and low time complexity. Graph-based storage offers the advantage of capturing semantic and structural information rather than simply using the Bag-of-Words technique. An approach called the Knowledgeable Graph (K-Graph) is proposed to capture semantic knowledge. Documents are stored using graph nodes. Thanks to weighted subgraphs, frequent subgraphs are extracted and stored in the Fast Embedding Referral Table (FERT). The table is maintained at different levels according to the headings and subheadings of the documents, which reduces the memory overhead and the retrieval and access time of the subgraphs needed. The authors propose an approach that reduces data redundancy to a large extent. On real-world datasets, K-Graph's performance and power usage are threefold better than those of current methods, and ninety-nine per cent accuracy demonstrates the robustness of the proposed algorithm.
103
Sharma S, Mehra R, Kumar S. Optimised CNN in conjunction with efficient pooling strategy for the multi-classification of breast cancer. IET Image Processing 2021; 15:936-946. [DOI: 10.1049/ipr2.12074]
Affiliation(s)
- Shallu Sharma
- Department of Electronics and Communication Engineering, National Institute of Technical Teachers Training & Research, Chandigarh, India
- Rajesh Mehra
- Department of Electronics and Communication Engineering, National Institute of Technical Teachers Training & Research, Chandigarh, India
- Sumit Kumar
- School of Electronics and Electrical Engineering, Division of Communication Systems, Lovely Professional University, Jalandhar, Punjab, India
- Department of Electronics Engineering, Indian Institute of Technology (Indian School of Mines), Dhanbad, Jharkhand, India
104
A discrete artificial bee colony algorithm for the distributed heterogeneous no-wait flowshop scheduling problem. Applied Soft Computing 2021. [DOI: 10.1016/j.asoc.2020.106946]
105
An Integration of Neural Network and Shuffled Frog-Leaping Algorithm for CNC Machining Monitoring. Foundations of Computing and Decision Sciences 2021. [DOI: 10.2478/fcds-2021-0003]
Abstract
This paper addresses Acoustic Emission (AE) from Computer Numerical Control (CNC) machining operations. Experimental measurements are performed on the CNC lathe sensors to provide power consumption data. A hybrid methodology integrating an Artificial Neural Network (ANN) with a Shuffled Frog-Leaping Algorithm (SFLA), called SFLA-ANN, is applied to these measurements to fuse the sensor data; the initial weights of the ANN are selected using SFLA. The goal is to assess the potency of the signal's periodic component among these sensors. The efficiency of the proposed SFLA-ANN method is compared against hybrids of the Simulated Annealing algorithm with ANN (SA-ANN) and of a Genetic Algorithm with ANN (GA-ANN).
106
Elastic Downsampling: An Adaptive Downsampling Technique to Preserve Image Quality. Electronics 2021. [DOI: 10.3390/electronics10040400]
Abstract
This paper presents a new adaptive downsampling technique called elastic downsampling, which enables high compression rates while preserving image quality. Adaptive downsampling techniques are based on the idea that image tiles can use different sampling rates depending on the amount of information conveyed by each block. However, current approaches suffer from blocking effects and artifacts that hinder the user experience. To bridge this gap, elastic downsampling relies on a Perceptual Relevance analysis that assigns sampling rates to the corners of blocks. The novel metric used for this analysis is based on the luminance fluctuations of an image region. This allows a gradual transition of the sampling rate within tiles, both horizontally and vertically. As a result, block artifacts are removed and fine details are preserved. Experimental results (using the Kodak and USC Miscellaneous image datasets) show a PSNR improvement of up to 15 dB and superior SSIM (Structural Similarity) compared with other techniques. More importantly, the algorithms involved are computationally cheap, so it is feasible to implement them in low-cost devices. The proposed technique has been successfully implemented using graphics processors (GPUs) and low-power embedded systems (Raspberry Pi) as target platforms.
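The corner-assigned sampling rates can be pictured with a small sketch. The abstract states that the Perceptual Relevance analysis assigns rates to block corners; how the rate varies inside a tile is our assumption here, illustrated with a plain bilinear blend (the function name `corner_rate` and all numbers are hypothetical):

```python
# Hypothetical sketch: bilinearly blending sampling rates assigned to the
# four corners of an image tile, so the rate varies smoothly inside it.
def corner_rate(u, v, r00, r10, r01, r11):
    """Sampling rate at normalized position (u, v) in [0,1]^2 of a tile,
    given the rates at corners (0,0), (1,0), (0,1), (1,1)."""
    return (r00 * (1 - u) * (1 - v) + r10 * u * (1 - v)
            + r01 * (1 - u) * v + r11 * u * v)

# Corners: left side sampled densely (rate 1.0), right side coarsely (0.25).
print(corner_rate(0.0, 0.5, 1.0, 0.25, 1.0, 0.25))  # 1.0 at the left edge
print(corner_rate(0.5, 0.5, 1.0, 0.25, 1.0, 0.25))  # 0.625 at the centre
```

Because the blend is continuous across tile boundaries that share corner values, neighbouring tiles cannot disagree abruptly, which is one way to obtain the artifact-free gradual transition the abstract describes.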
107
de Domingo M, Ortigosa N, Sevilla J, Roger S. Cluster-Based Relocation of Stations for Efficient Forest Fire Management in the Province of Valencia (Spain). Sensors 2021; 21:797. [PMID: 33504117] [PMCID: PMC7865265] [DOI: 10.3390/s21030797]
Abstract
Forest fires are undesirable situations with tremendous impacts on wildlife and people's lives. Reaching them quickly is essential to slowing their expansion and putting them out effectively. This work proposes an optimized distribution of fire stations in the province of Valencia (Spain) to minimize the impact of forest fires. Using historical data about fires in the province, together with location information for the existing fire stations and municipalities, two different clustering techniques were applied. The Floyd-Warshall dynamic programming algorithm was used to estimate the average times to reach fires between municipalities and fire stations, in order to quantify the impact of station relocation; the minimization itself was done approximately through k-means clustering. Outcomes with different numbers of clusters revealed a tradeoff between reducing arrival time and the cost of additional stations. The results show that the proposed relocation of fire stations generally ensures faster arrival to the municipalities than the current disposition of stations. Deployment costs associated with station relocation are also of paramount importance, so this factor was taken into account in the proposed approach.
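The estimation pipeline described above (Floyd-Warshall travel times feeding a placement score) can be sketched as follows; the road graph and travel times are made up for illustration, not the Valencia data:

```python
# Made-up 4-node road graph (times in minutes); INF = no direct road.
INF = float("inf")

def floyd_warshall(w):
    """All-pairs shortest travel times from a weighted adjacency matrix."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def mean_response_time(d, stations, towns):
    """Average over towns of the travel time from the nearest station."""
    return sum(min(d[s][t] for s in stations) for t in towns) / len(towns)

w = [[0, 10, INF, INF],
     [10, 0, 5, INF],
     [INF, 5, 0, 10],
     [INF, INF, 10, 0]]
d = floyd_warshall(w)
print(d[0][3])                                 # 25: path 0-1-2-3
print(mean_response_time(d, [1], [0, 2, 3]))   # 10.0 with a station at node 1
```

In the paper's setting, candidate station sets would come from the clustering step, and scores like `mean_response_time` quantify how a relocation compares to the current disposition.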
Affiliation(s)
- Miguel de Domingo
- Computer Science Department, Universitat de València, Av. de la Universitat s/n, 46100 Burjassot, Spain
- Correspondence: (M.d.D.); (S.R.)
- Nuria Ortigosa
- Computer Science Department, Universitat de València, Av. de la Universitat s/n, 46100 Burjassot, Spain
- I.U. Matemática Pura y Aplicada, Universitat Politècnica de València, Camino de Vera s/n, 46022 València, Spain
- Javier Sevilla
- Computer Science Department, Universitat de València, Av. de la Universitat s/n, 46100 Burjassot, Spain
- Sandra Roger
- Computer Science Department, Universitat de València, Av. de la Universitat s/n, 46100 Burjassot, Spain
- Correspondence: (M.d.D.); (S.R.)
108
Liao Z, Pang X, Zhang J, Xiong B, Wang J. Blockchain on Security and Forensics Management in Edge Computing for IoT: A Comprehensive Survey. IEEE Transactions on Network and Service Management 2021. [DOI: 10.1109/tnsm.2021.3122147]
109
Lemenkova P. Mapping environmental and climate variations by GMT: A case of Zambia, Central Africa. Zemljiste i Biljka 2021. [DOI: 10.5937/zembilj2101117l]
Abstract
Zambia has recently experienced several environmental threats from climate change, such as droughts, temperature rise and occasional flooding, all of which affect agricultural sustainability and people's wellbeing through negative effects on plants and growing crops. This paper aims to show variations in several climate and environmental parameters across Zambia, with spatial variability and trends in the country's key environmental areas: the Zambezi River and its tributaries, Livingstone near the Victoria Falls, and the central region with the Muchinga Mountains. A series of 10 maps was plotted using the TerraClimate dataset: precipitation, soil moisture, Palmer Drought Severity Index (PDSI), downward surface shortwave radiation, vapor pressure deficit and its anomalies, potential and actual evapotranspiration, and wind speed, in relation to the topographic distribution of elevations in Zambia plotted using GEBCO/SRTM data. The PDSI ranged from a minimum of -5.7 to a maximum of 16.6, with a mean of 7.169 and a standard deviation of 4.278. The PDSI is effective in quantifying drought over long periods: because the index applies temperature data and a water balance model, it indicates the effect of climate warming on drought through its correlation with potential evapotranspiration. Soil moisture in Zambia shows a minimum of 1 mm/m, a maximum of 413 mm/m, and a mean of 173 mm/m. This study is technically based on the Generic Mapping Tools (GMT) cartographic scripting toolset. The paper contributes to the environmental monitoring of Zambia by presenting a series of climate and environmental maps that are beneficial for agricultural mapping of the country.
110
Kudla M, Gutowska K, Synak J, Weber M, Bohnsack KS, Lukasiak P, Villmann T, Blazewicz J, Szachniuk M. Virxicon: A Lexicon Of Viral Sequences. Bioinformatics 2020; 36:5507-5513. [PMID: 33367605] [PMCID: PMC8016492] [DOI: 10.1093/bioinformatics/btaa1066]
Abstract
Motivation: Viruses are the most abundant biological entities and constitute a large reservoir of genetic diversity. In recent years, knowledge about them has increased significantly as a result of dynamic development in the life sciences and rapid technological progress. This knowledge is scattered across various data repositories, making comprehensive analysis of viral data difficult. Results: In response to the need to gather comprehensive knowledge of viruses and viral sequences, we developed Virxicon, a lexicon of all experimentally acquired sequences for RNA and DNA viruses. Quickly obtaining data for entire viral groups, searching sequences by level of taxonomic hierarchy (according to the Baltimore classification and ICTV taxonomy), and tracking the distribution of viral data and its growth over time are unique features of our database compared to other tools. Availability and implementation: Virxicon is a publicly available resource, updated weekly. It has an intuitive web interface and can be freely accessed at http://virxicon.cs.put.poznan.pl/.
Collapse
Affiliation(s)
- Mateusz Kudla
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Saxon Institute for Computational Intelligence and Machine Learning, University of Applied Sciences Mittweida, Mittweida, 09648, Germany
- Kaja Gutowska
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, 61-704, Poland
- Jaroslaw Synak
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Mirko Weber
- Saxon Institute for Computational Intelligence and Machine Learning, University of Applied Sciences Mittweida, Mittweida, 09648, Germany
- Katrin Sophie Bohnsack
- Saxon Institute for Computational Intelligence and Machine Learning, University of Applied Sciences Mittweida, Mittweida, 09648, Germany
- Piotr Lukasiak
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, 61-704, Poland
- Thomas Villmann
- Saxon Institute for Computational Intelligence and Machine Learning, University of Applied Sciences Mittweida, Mittweida, 09648, Germany
- Jacek Blazewicz
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, 61-704, Poland
- Marta Szachniuk
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan, 60-965, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, 61-704, Poland
111
Zych M. Education for business analysts in Poland. Education for Information 2020. [DOI: 10.3233/efi-200391]
Abstract
The research aim is to investigate the relationship between skills required for a business analyst’s (BA) job and learning outcomes from selected Library and Information Science (LIS) degrees in Poland. Two hypotheses are stated: 1. Employers in Poland look for their future BAs among graduates with different degrees; 2. LIS-related degrees provide the core competences required for a BA job in Poland. An analysis of job offers for BAs in Poland was made, along with a comparative analysis of BA skills from the Standard Classification of Occupations, version 3 of the Guide to Business Analysis Body of Knowledge (BABOK 3), job offers and learning outcomes from Polish LIS-related degree courses. The most common requirements posted in job offers are language skills, knowledge of IT tools, communication, presentation and mediation skills, personality traits and analytical skills. Distinctive BA features such as understanding the nature of business analysis, requirements engineering, using notations and process frameworks, systems modelling and programming were found only in job offers and BABOK 3. Recommendations are made for LIS-related degree courses on how to deal with the mismatch between the skills required in the job market and those received through formal education.
112
Classification of Categorical Data Based on the Chi-Square Dissimilarity and t-SNE. Computation 2020. [DOI: 10.3390/computation8040104]
Abstract
The recurrent use of databases with categorical variables in different applications demands new alternatives to identify relevant patterns. Classification is an interesting approach for the recognition of this type of data; however, there are few methods for this purpose in the literature, and those techniques focus specifically on kernels, with accuracy problems and high computational cost. For this reason, we propose an identification approach for categorical variables using conventional classifiers (LDC, QDC, KNN, SVM) and different mapping techniques to increase the separability of classes. Specifically, we map the initial features (categorical attributes) to another space, using the Chi-square (C-S) statistic as a measure of dissimilarity. Then, we employ t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce the dimensionality of the data to two or three features, allowing a significant reduction of computational times in learning methods. We evaluate the performance of the proposed approach in terms of accuracy for several experimental configurations and public categorical datasets downloaded from the UCI repository, and we compare it with relevant state-of-the-art methods. Results show that C-S mapping and t-SNE considerably diminish the computational times in recognition tasks while accuracy is preserved. Moreover, when only the C-S mapping is applied to the datasets, the separability of classes is enhanced and the performance of the learning algorithms clearly increases.
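As a sketch of the mapping step: a chi-square dissimilarity between two rows can be computed on per-category frequency profiles before handing a dissimilarity matrix to t-SNE. The exact C-S formulation the authors use is not reproduced here, so the formula below (the common chi-square histogram distance) is an assumption, with made-up profiles:

```python
# Chi-square dissimilarity between two categorical rows encoded as
# per-category frequency profiles (a common formulation; the paper's
# exact C-S mapping may differ).
def chi_square_dissimilarity(x, y):
    """0.5 * sum (x_i - y_i)^2 / (x_i + y_i), skipping empty bins."""
    return 0.5 * sum((a - b) ** 2 / (a + b) for a, b in zip(x, y) if a + b > 0)

row_a = [0.5, 0.5, 0.0]   # attribute values split between categories 1-2
row_b = [0.0, 0.5, 0.5]   # versus categories 2-3
print(chi_square_dissimilarity(row_a, row_b))  # 0.5
print(chi_square_dissimilarity(row_a, row_a))  # 0.0: identical profiles
```

A pairwise matrix of such values can then be embedded into two or three dimensions, which is where the reported speed-up of the downstream classifiers comes from.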
113
A New Application for the Goal Programming—The Target Decision Rule for Uncertain Problems. Journal of Risk and Financial Management 2020. [DOI: 10.3390/jrfm13110280]
Abstract
Goal programming (GP) is a well-known approach applied to multi-criteria decision making (M-DM). It has been used in many domains, and the literature offers diverse extensions of the procedure. On the other hand, some evident analogies between M-DM under certainty and scenario-based one-criterion decision making under uncertainty (1-DMU) have so far not been revealed in the literature. These similarities make it possible to adjust goal programming to an entirely new domain. The purpose of the paper is to create a novel method for uncertain problems on the basis of the GP ideas. To achieve this aim, we carefully examine the analogies occurring between the structures of both issues (M-DM and 1-DMU) and analyze some differences resulting from a different interpretation of the data. By analogy to goal programming, four hybrids for 1-DMU are formulated, differing in the type of decision maker considered (pessimist, optimist, moderate). The new decision rule may be helpful when solving uncertain problems, since it is especially designed for neutral criteria, which are not taken into account in existing procedures developed for 1-DMU.
114
Sustainable Production–Inventory Model in Technical Cooperation on Investment to Reduce Carbon Emissions. Processes (Basel) 2020. [DOI: 10.3390/pr8111438]
Abstract
Carbon cap-and-trade and carbon offsets are common and important carbon emission reduction policies in many countries. In addition, carbon emissions from business activities can be effectively reduced through specific capital investments in green technologies. Nevertheless, such capital investments are costly and not all enterprises can afford these investments. Therefore, if all members of a supply chain agree to share the investments in the facilities, the supply chain can reduce carbon emissions and generate more profit. Under carbon cap-and-trade and carbon tax policies, this study proposes a production–inventory model in which the buyer and vendor in the integrated supply chain agree to co-invest funds to reduce carbon emissions. We planned to integrate production, delivery, replenishment, and technology to reduce carbon emissions so as to maximize the total profit of the supply chain system. Several examples are simulated and the sensitivity analysis of the main parameters is carried out. The optimal solutions and joint total profit under various carbon emission policies are also compared. The future carbon emission control trend is expected to enable companies to share risks by co-investing and developing sustainable supply chains.
115
As'ad R, Hariga M, Shamayleh A. Sustainable dynamic lot sizing models for cold products under carbon cap policy. Computers & Industrial Engineering 2020; 149:106800. [PMID: 32901170] [PMCID: PMC7471773] [DOI: 10.1016/j.cie.2020.106800]
Abstract
Amid the ever-growing interest in operational supply chain models that incorporate environmental aspects as an integral part of the decision-making process, this paper addresses the dynamic lot sizing problem of a cold product while accounting for carbon emissions generated during temperature-controlled storage and transportation activities. We present two mixed integer programming models to tackle the two cases where the carbon cap is imposed over the whole planning horizon versus the more stringent version of a cap per period. For the first model, a Lagrangian relaxation approach is proposed, which provides a means of comparing the operational cost and carbon footprint performance of the carbon tax and carbon cap policies. Subsequently, a bisection-based algorithm is developed to solve the relaxed model and generate the optimal ordering policy. The second model is solved via a dynamic programming based algorithm while respecting two established lower and upper bounds on the periodic carbon cap. The results of the computational experiments for the first model display a stepwise increase (decrease) in the total carbon emissions (operational cost) as the preset cap value is increased. A similar behavior is observed for the second model, with the exception that paradoxical increases in the total emissions are sometimes realized with slightly tighter values of the periodic cap.
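For context, the classical uncapacitated dynamic lot sizing recursion that such models extend can be sketched as below; this is the textbook Wagner-Whitin version, without the carbon-cap constraints or cold-chain emission terms the paper adds:

```python
# Textbook Wagner-Whitin dynamic lot sizing (no carbon cap): choose order
# periods to minimize setup + holding costs for a known demand vector.
def wagner_whitin(demand, setup, hold):
    n = len(demand)
    best = [0.0] + [float("inf")] * n   # best[t] = min cost for periods 0..t-1
    for t in range(1, n + 1):
        for j in range(t):              # last order covers periods j..t-1
            cost = best[j] + setup
            # holding cost: demand of period k is carried for k - j periods
            cost += sum(hold * (k - j) * demand[k] for k in range(j, t))
            best[t] = min(best[t], cost)
    return best[n]

# Cheap setup: order every period. Expensive setup: batch and hold stock.
print(wagner_whitin([10, 10], setup=5, hold=1))   # 10.0 (two orders)
print(wagner_whitin([10, 10], setup=20, hold=1))  # 30.0 (one order + holding)
```

The paper's per-period cap variant layers emission bookkeeping onto this recursion, which is why a DP-based algorithm with bounds on the periodic cap remains tractable.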
Affiliation(s)
- Rami As'ad
- Department of Industrial Engineering, College of Engineering, American University of Sharjah, P.O. Box 26666, Sharjah, United Arab Emirates
- Moncer Hariga
- Department of Industrial Engineering, College of Engineering, American University of Sharjah, P.O. Box 26666, Sharjah, United Arab Emirates
- Abdulrahim Shamayleh
- Department of Industrial Engineering, College of Engineering, American University of Sharjah, P.O. Box 26666, Sharjah, United Arab Emirates
116
An algorithm to elicitate ELECTRE II, III and IV parameters. Data Technologies and Applications 2020. [DOI: 10.1108/dta-07-2020-0161]
Abstract
Purpose: This paper presents an algorithm that can elicit all or any combination of parameters for the ELECTRE II, III or IV methods. The algorithm borrows steps from a machine-learning ensemble technique, the random forest, and for that reason the authors named the approach the Ranking Trees Algorithm.
Design/methodology/approach: First, for a given method, the authors generate a set of ELECTRE models, where each model solves a random sample of criteria and actions (alternatives). Second, for each generated model, all actions are projected into a 1D space; in general, the best actions have higher values in the 1D space than the worst ones, so they can guide the genetic algorithm in the final step, the optimization phase. Finally, in the optimization phase, each model has its parameters optimized.
Findings: The results can be used in two different ways: all models can be merged to find the elicited parameters, or the models can be ensembled, with the median of all ranks representing the final rank. The numerical examples achieved a Kendall tau rank correlation above 0.85, and these results performed as well as those obtained by a group of specialists.
Originality/value: For the first time, the elicitation of ELECTRE parameters is made by an ensemble technique composed of a set of uncorrelated multicriteria models that can generate robust solutions.
117
Gumna J, Zok T, Figurski K, Pachulska-Wieczorek K, Szachniuk M. RNAthor - fast, accurate normalization, visualization and statistical analysis of RNA probing data resolved by capillary electrophoresis. PLoS One 2020; 15:e0239287. [PMID: 33002005] [PMCID: PMC7529196] [DOI: 10.1371/journal.pone.0239287]
Abstract
RNAs adopt specific structures to perform their functions, which are critical to fundamental cellular processes. For decades, these structures have been determined and modeled with strong support from computational methods. Still, the accuracy of the latter ones depends on the availability of experimental data, for example, chemical probing information that can define pseudo-energy constraints for RNA folding algorithms. At the same time, diverse computational tools have been developed to facilitate analysis and visualization of data from RNA structure probing experiments followed by capillary electrophoresis or next-generation sequencing. RNAthor, a new software tool for the fully automated normalization of SHAPE and DMS probing data resolved by capillary electrophoresis, has recently joined this collection. RNAthor automatically identifies unreliable probing data. It normalizes the reactivity information to a uniform scale and uses it in the RNA secondary structure prediction. Our web server also provides tools for fast and easy RNA probing data visualization and statistical analysis that facilitates the comparison of multiple data sets. RNAthor is freely available at http://rnathor.cs.put.poznan.pl/.
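To illustrate what reactivity normalization involves, the sketch below implements the widely used 2%/8% rule (treat the top 2% of reactivities as outliers, scale by the mean of the next 8%). Whether RNAthor applies exactly this scheme is not stated here, so treat it as an assumption with synthetic data:

```python
# Sketch of the common 2%/8% SHAPE/DMS normalization rule (an assumption;
# RNAthor's exact scheme may differ).
def normalize_2_8(reactivities):
    """Exclude the top 2% of values as outliers, then divide everything
    by the mean of the next 8% (the 'effective maximum')."""
    vals = sorted(reactivities, reverse=True)
    n = len(vals)
    top2 = max(1, int(0.02 * n))     # outliers to skip
    next8 = max(1, int(0.08 * n))    # values defining the scale
    scale = sum(vals[top2:top2 + next8]) / next8
    return [r / scale for r in reactivities]

# 100 synthetic reactivities: after normalization the bulk lies near [0, 1].
demo = normalize_2_8(list(range(1, 101)))
print(round(max(demo), 3))  # 1.058
```

Bringing every capillary run onto such a uniform scale is what makes multi-dataset comparison and downstream use as folding constraints meaningful.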
Affiliation(s)
- Julita Gumna
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, Poland
- Tomasz Zok
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
- Kacper Figurski
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
- Marta Szachniuk
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan, Poland
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
- E-mail: (KPW); (MS)
118
Lathamaheswari M, Nagarajan D, Kavikumar J, Broumi S. Triangular interval type-2 fuzzy soft set and its application. Complex & Intelligent Systems 2020. [DOI: 10.1007/s40747-020-00151-6]
Abstract
Decision-making is an essential task in science and engineering. Since most real-world problems involve uncertainty, making decisions is challenging for decision makers. Soft sets have the advantage of being free from the deficiencies of the parameterization tools of existing theories, namely probability, fuzzy theory and the theory of rough sets. Linguistic terms mean different things to different people, so variability in experts' acceptance degrees is possible. Using type-1 fuzzy sets here leads to noisy and uncertain results, and the parameters may also be noisy, so type-2 fuzzy sets may be used to address these issues. Therefore, a triangular interval type-2 fuzzy soft set is considered in the present work by combining a triangular interval type-2 fuzzy set with a soft set. In this paper, a triangular interval type-2 fuzzy soft weighted arithmetic operator (TIT2FSWA) is proposed with its desired mathematical properties, and the proposed methodology is applied to a decision-making problem for profit analysis. A comparative analysis with existing methods shows the effectiveness of the proposed method.
119
Spatial Configuration of Abdominal Aortic Aneurysm Analysis as a Useful Tool for the Estimation of Stent-Graft Migration. Diagnostics (Basel) 2020; 10:737. [PMID: 32977588] [PMCID: PMC7598279] [DOI: 10.3390/diagnostics10100737]
Abstract
The aim of this study was to prepare a self-made mathematical algorithm for estimating the risk of stent-graft migration, using data on abdominal aortic aneurysm (AAA) size and the geometry of blood flow through the aneurysm sac before and after stent-graft implantation. AngioCT data from 20 patients aged 50-60 years, before and after stent-graft placement in the AAA, were analyzed. To estimate the risk of stent-graft migration for each patient, we prepared an opposite spatial configuration of a virtually reconstructed stent-graft with a long or short body. Thus, three groups of 3D geometries were analyzed: 20 models of the aneurysm, 20 models of long-body stent-grafts, and 20 models of short-body stent-grafts. The proposed self-made algorithm demonstrated its efficiency and usefulness in estimating wall shear stress (WSS) values. Comparing the long or short type of stent-graft with the AAA geometries allowed us to analyze the implants' spatial configuration. Our study indicated that the short stent-graft, after placement in the AAA sac, generated lower drag forces compared to the long stent-graft. In each case, the shape factor was higher for the short stent-graft compared to the long one.
Collapse
|
120
|
Abstract
We present an approach to mine cardinality restriction axioms from an existing knowledge graph in order to extend an ontology describing the graph. We compare frequency estimation with kernel density estimation as approaches to obtain the cardinalities in restrictions. We also propose numerous strategies for filtering the obtained axioms in order to make them more useful to the ontology engineer. We report the results of an experimental evaluation on DBpedia 2016-10 and show that using kernel density estimation to compute the cardinalities in cardinality restrictions yields more robust results than using frequency estimation. We also show that while filtering is of limited usability for minimum cardinality restrictions, it is much more important for maximum cardinality restrictions. The presented findings can be used to extend existing ontology engineering tools in order to support ontology construction and enable more efficient creation of knowledge-intensive artificial intelligence systems.
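The frequency-versus-KDE comparison can be illustrated on toy per-subject property counts; this is a simplified sketch (hand-rolled Gaussian kernel, made-up counts), not the paper's implementation:

```python
import math

# Simplified sketch: estimating a maxCardinality value for a property from
# per-subject usage counts. Frequency estimation takes the raw maximum;
# a Gaussian kernel density estimate down-weights rare outlier counts.

def frequency_estimate(counts):
    return max(counts)

def kde_estimate(counts, bandwidth=1.0):
    def density(x):
        return sum(math.exp(-((x - c) ** 2) / (2 * bandwidth ** 2)) for c in counts)
    return max(range(min(counts), max(counts) + 1), key=density)

# 20 subjects use the property 2 or 3 times; one noisy subject uses it 40 times.
counts = [2] * 12 + [3] * 8 + [40]
print(frequency_estimate(counts))  # 40 (dominated by the outlier)
print(kde_estimate(counts))        # 2 (mode of the bulk of the data)
```
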
Collapse
|
121
|
Artificial Intelligence Research Community and Associations in Poland. FOUNDATIONS OF COMPUTING AND DECISION SCIENCES 2020. [DOI: 10.2478/fcds-2020-0009] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
In recent years, Artificial Intelligence has made tremendous progress by offering a variety of novel methods, tools, and spectacular applications. Besides scientific breakthroughs, it has attracted the interest of both the general public and industry. It has also opened heated debates on the impact of Artificial Intelligence on the economy and society. With this international landscape in mind, in this short paper we discuss the Polish AI research community, some of its main achievements, opportunities, and limitations. We put this discussion in the context of current developments in the international AI community. Moreover, we refer to the activities of Polish scientific associations and their initiative of founding the Polish Alliance for the Development of Artificial Intelligence (PP-RAI). Finally, the two last editions of the PP-RAI joint conferences are summarized.
Collapse
|
122
|
Popenda M, Miskiewicz J, Sarzynska J, Zok T, Szachniuk M. Topology-based classification of tetrads and quadruplex structures. Bioinformatics 2020; 36:1129-1134. [PMID: 31588513 PMCID: PMC7031778 DOI: 10.1093/bioinformatics/btz738] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/15/2019] [Revised: 08/12/2019] [Accepted: 09/25/2019] [Indexed: 12/02/2022] Open
Abstract
Motivation: Quadruplexes attract the attention of researchers from many fields of bioscience. Due to their specific structure, these tertiary motifs are involved in various biological processes. They are also promising therapeutic targets in many strategies of drug development, including anticancer and neurological disease treatment. The uniqueness and diversity of their forms mean that quadruplexes show great potential in novel biological applications. The existing approaches for quadruplex analysis are based on sequence or 3D structure features and address canonical motifs only. Results: In our study, we analyzed tetrads and quadruplexes contained in nucleic acid molecules deposited in the Protein Data Bank. Focusing on their secondary structure topology, we adjusted its graphical diagram and proposed new dot-bracket and arc representations. We defined a novel classification of these motifs that can handle both canonical and non-canonical cases. Based on this new taxonomy, we implemented a method that automatically recognizes the types of tetrads and quadruplexes occurring as unimolecular structures. Finally, we conducted a statistical analysis of these motifs found in experimentally determined nucleic acid structures in relation to the new classification. Availability and implementation: https://github.com/tzok/eltetrado/ Supplementary information: Supplementary data are available at Bioinformatics online.
Collapse
Affiliation(s)
- Mariusz Popenda
- Department of Structural Bioinformatics, Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan 61-704, Poland
| | - Joanna Miskiewicz
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan 60-965, Poland
| | - Joanna Sarzynska
- Department of Structural Bioinformatics, Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan 61-704, Poland
| | - Tomasz Zok
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan 60-965, Poland.,Poznan Supercomputing and Networking Center, Poznan 61-139, Poland
| | - Marta Szachniuk
- Department of Structural Bioinformatics, Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan 61-704, Poland.,Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Poznan 60-965, Poland
| |
Collapse
|
123
|
Ghosh A, Malla SR, Bhalla AS, Manchanda S, Kandasamy D, Kumar R. Texture analysis of routine T2 weighted fat-saturated images can identify head and neck paragangliomas - A pilot study. Eur J Radiol Open 2020; 7:100248. [PMID: 32984446 PMCID: PMC7498758 DOI: 10.1016/j.ejro.2020.100248] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/11/2020] [Accepted: 08/14/2020] [Indexed: 01/11/2023] Open
Abstract
PURPOSE To evaluate the role of first- and second-order texture parameters obtained from T2-weighted fat-saturated DIXON images in differentiating paragangliomas from other neck masses, and to develop a statistical model to classify them. METHOD We retrospectively evaluated 38 paragangliomas, 18 nerve-sheath tumours and 14 other miscellaneous neck lesions (a total of 70 lesions in 63 patients) obtained from an IRB-approved study conducted between January 2016 and June 2019, using a composite gold standard of histopathology, cytology and DOTANOC PET-CT. Fat-suppressed T2-weighted DIXON axial images were used. First- and second-order texture parameters were calculated from the original and filtered images. Feature selection using F-statistics and collinearity analysis provided 14 texture parameters for further analysis. The Mann-Whitney U test was used to compare the groups, and p-values were adjusted for multiple comparisons. ROC curve analysis was used to obtain optimal cut-offs. RESULTS A total of ten texture features were significantly different between paragangliomas and non-paraganglioma lesions. The minimum of the histogram of grey levels was lower in paragangliomas, with a cut-off of ≤113.462 obtaining 62.9% sensitivity and 77.27% specificity in differentiating paragangliomas from non-paragangliomas. A logistic regression model was trained (n = 49) using forward feature selection, which, when evaluated on the validation set (n = 21), obtained an AUC of 0.855 (95% CI, 0.633 to 0.968) with a positive likelihood ratio of 4.545 (95% CI, 1.298-15.923) in differentiating paragangliomas from non-paragangliomas. CONCLUSION Texture analysis of a routine imaging sequence can identify paragangliomas with high accuracy. Further development of texture analysis would enable better imaging workflow, resource utilisation and imaging cost reductions.
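The single-feature cut-off rule reported in the abstract can be sketched with made-up feature values; the numbers below are illustrative only and do not reproduce the study's data:

```python
# Illustrative sketch (made-up data): applying a single texture-feature
# cut-off, like the abstract's "minimum grey level <= 113.462" rule, and
# scoring it with sensitivity and specificity.

def threshold_classify(values, cutoff):
    """Label a lesion positive (paraganglioma) when the feature is <= cutoff."""
    return [v <= cutoff for v in values]

def sens_spec(predictions, truth):
    tp = sum(p and t for p, t in zip(predictions, truth))
    tn = sum((not p) and (not t) for p, t in zip(predictions, truth))
    fn = sum((not p) and t for p, t in zip(predictions, truth))
    fp = sum(p and (not t) for p, t in zip(predictions, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical minimum-grey-level values; True = paraganglioma.
features = [90, 100, 120, 105, 130, 140, 110, 150]
labels = [True, True, True, False, False, False, True, False]
sens, spec = sens_spec(threshold_classify(features, 113.462), labels)
print(sens, spec)  # 0.75 0.75
```
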
Collapse
Key Words
- AUC, area under the curve
- FDG-PET, fluorodeoxy-glucose positron emission tomography
- GLCM, grey level co-occurrence matrix
- Head neck
- ID, inverse difference
- IDM, inverse difference moment
- IDMN, inverse difference moment normalized
- IDN, inverse difference normalized
- IMC1, informational measure of correlation 1
- IMC2, informational measure of correlation 2
- LoG, laplacian of gaussian
- MCC, maximal correlation coefficient
- NST, nerve sheath tumour
- Nerve sheath tumour
- Paraganglioma
- ROC, receiver operator characteristics
- Radiomics
- Schwannoma
- Texture analysis
Collapse
Affiliation(s)
- Adarsh Ghosh
- Department of Radiodiagnosis, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| | - Soumya Ranjan Malla
- Department of Radiodiagnosis, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| | - Ashu Seith Bhalla
- Department of Radiodiagnosis, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| | - Smita Manchanda
- Department of Radiodiagnosis, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| | - Devasenathipathy Kandasamy
- Department of Radiodiagnosis, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| | - Rakesh Kumar
- Department of Otorhinolaryngology, Head & Neck Surgery, All India Institute of Medical Sciences, Ansari Nagar, New Delhi, 110029, India
| |
Collapse
|
124
|
Miskiewicz J, Sarzynska J, Szachniuk M. How bioinformatics resources work with G4 RNAs. Brief Bioinform 2020; 22:5902714. [PMID: 32898859 PMCID: PMC8138894 DOI: 10.1093/bib/bbaa201] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/25/2020] [Revised: 08/03/2020] [Accepted: 08/04/2020] [Indexed: 12/17/2022] Open
Abstract
Interest in quadruplexes (G4s) is growing with the number of identified G4 structures and the knowledge about their biomedical potential. These unique motifs form in many organisms, including humans, where their appearance correlates with various diseases. Scientists store and analyze quadruplexes using recently developed bioinformatic tools, many of them focused on DNA structures. With an expanding collection of G4 RNAs, we check how the existing tools deal with them. We review all available bioinformatics resources dedicated to quadruplexes and examine their usefulness in G4 RNA analysis. We distinguish the following subsets of resources: databases, tools to predict putative quadruplex sequences, tools to predict secondary structure with quadruplexes, and tools to analyze and visualize quadruplex structures. We share the results obtained from processing specially created RNA datasets with these tools. Contact: mszachniuk@cs.put.poznan.pl Supplementary information: Supplementary data are available at Briefings in Bioinformatics online.
Collapse
Affiliation(s)
- Joanna Miskiewicz
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Piotrowo 2, 60-965 Poznan, Poland
| | - Joanna Sarzynska
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, Noskowskiego 12/14, 61-704 Poznan, Poland
| | - Marta Szachniuk
- Institute of Computing Science and European Centre for Bioinformatics and Genomics, Poznan University of Technology, Piotrowo 2, 60-965 Poznan, Poland
| |
Collapse
|
125
|
A Multistage Sustainable Production–Inventory Model with Carbon Emission Reduction and Price-Dependent Demand under Stackelberg Game. APPLIED SCIENCES-BASEL 2020. [DOI: 10.3390/app10144878] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
This paper investigated a multistage sustainable production–inventory model for deteriorating items (i.e., raw materials and finished goods) with price-dependent demand and collaborative carbon reduction technology investment under carbon tax regulation. The model was developed by first defining the total profit of the supply chain members under carbon tax regulation and, second, considering a manufacturer (leader)–retailer (follower) Stackelberg game. The optimal equilibrium solutions that maximize the manufacturer's and retailer's total profits were determined analytically. An algorithm complemented the model to determine the optimal equilibrium solutions, which were then subjected to sensitivity analyses for the major parameters. Based on the numerical analysis, (a) carbon tax policies help reduce carbon emissions for both the manufacturer and retailer; (b) most carbon emissions from supply chain operations negatively impact the total profits of both members; (c) the retailer may increase the optimal equilibrium selling price in response to an increase in carbon emissions from supply chain operations or in the carbon tax; and (d) autonomous consumption positively affects both members' optimal equilibrium policies and total profits, whereas induced consumption does the opposite. These findings offer practical guidance for companies seeking profit while fulfilling environmental responsibility, and for governments.
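As a rough illustration of the leader-follower (Stackelberg) logic, not the paper's multistage model with deterioration and carbon tax, backward induction on a simple linear-demand price game looks like this; all parameters are made up:

```python
# Illustrative sketch: Stackelberg backward induction with linear demand
# D(p) = a - b*p, unit cost c, and wholesale price w set by the leader.

def follower_price(w, a, b):
    """Retailer's best response: maximise (p - w) * (a - b*p)."""
    return (a + b * w) / (2 * b)

def leader_wholesale(a, b, c):
    """Manufacturer anticipates the follower: maximise (w - c) * (a - b*p*(w))."""
    return (a + b * c) / (2 * b)

a, b, c = 100.0, 2.0, 10.0
w = leader_wholesale(a, b, c)  # 30.0
p = follower_price(w, a, b)    # 40.0
demand = a - b * p             # 20.0
print(w, p, (w - c) * demand, (p - w) * demand)  # 30.0 40.0 400.0 200.0
```

The leader moves first but, anticipating the follower's best response, earns the larger equilibrium profit, which is the core asymmetry the game captures.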
Collapse
|
126
|
Afzal Khan MN, Raheel Bhutta M, Hong KS. Effect of stimulation duration to the existence of initial dip. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. ANNUAL INTERNATIONAL CONFERENCE 2020; 2020:390-393. [PMID: 33018010 DOI: 10.1109/embc44109.2020.9175930] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/11/2023]
Abstract
In this paper, we investigate the effect of stimulation duration on the hemodynamic responses (HRs) in the somatosensory cortex. In doing so, the relationship between stimulation duration and the initial dip is also investigated. The HRs are measured using functional near-infrared spectroscopy (fNIRS). The HR signals related to finger poking are acquired from the left somatosensory cortex. Two different stimulation durations (i.e., 1 and 5 sec) were tested in this study. From the results, it is concluded that a stimulation duration of 1 sec (a short stimulus) evokes an initial dip in the somatosensory cortex, which disappears as the stimulation duration gets longer. Therefore, the 1-sec stimulation duration can serve the purposes of an fNIRS-based brain-computer interface.
Collapse
|
127
|
Thomas JV, Abou Elkassem AM, Ganeshan B, Smith AD. MR Imaging Texture Analysis in the Abdomen and Pelvis. Magn Reson Imaging Clin N Am 2020; 28:447-456. [PMID: 32624161 DOI: 10.1016/j.mric.2020.03.009] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
Texture analysis (TA) is a form of radiomics that refers to quantitative measurements of the histogram, distribution, and/or relationship of pixel intensities or gray scales within a region of interest on an image. TA can be applied to MR images of the abdomen and pelvis; its main strength is quantitative analysis of pixel intensities and heterogeneity rather than subjective/qualitative analysis. There are multiple limitations of MRTA. Despite these limitations, there is a growing body of literature supporting MRTA. This review discusses the application of MRTA to the abdomen and pelvis.
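A minimal sketch of the first-order histogram measurements the abstract refers to (mean, standard deviation, skewness, kurtosis over ROI pixel intensities); the toy ROI values are made up:

```python
import math

# Sketch of first-order (histogram) texture parameters computed over the
# pixel intensities of a region of interest; the ROI values are made up.

def first_order_features(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    sd = math.sqrt(sum((p - mean) ** 2 for p in pixels) / n)
    skew = sum((p - mean) ** 3 for p in pixels) / (n * sd ** 3) if sd else 0.0
    kurt = sum((p - mean) ** 4 for p in pixels) / (n * sd ** 4) if sd else 0.0
    return {"mean": mean, "sd": sd, "skewness": skew, "kurtosis": kurt}

roi = [10, 12, 11, 40, 12, 11, 10, 12]  # toy ROI with one bright pixel
f = first_order_features(roi)
print(f["mean"], f["skewness"] > 0)  # 14.75, positively skewed
```
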
Collapse
Affiliation(s)
- John V Thomas
- Body Imaging Section, Department of Radiology, University of Alabama at Birmingham, N355 Jefferson Tower, 619 19th Street South, Birmingham, AL 35249-6830, USA.
| | - Asser M Abou Elkassem
- Department of Radiology, University of Alabama at Birmingham, 619 19th Street South, Birmingham, AL 35249-6830, USA
| | - Balaji Ganeshan
- Institute of Nuclear Medicine, University College of London, 5th Floor, Tower, 235 Euston Road, London NW1 2BU, UK
| | - Andrew D Smith
- Department of Radiology, University of Alabama at Birmingham, 619 19th Street South, Birmingham, AL 35249-6830, USA
| |
Collapse
|
128
|
Carrillo-Alarcón JC, Morales-Rosales LA, Rodríguez-Rángel H, Lobato-Báez M, Muñoz A, Algredo-Badillo I. A Metaheuristic Optimization Approach for Parameter Estimation in Arrhythmia Classification from Unbalanced Data. SENSORS (BASEL, SWITZERLAND) 2020; 20:s20113139. [PMID: 32498271 PMCID: PMC7308921 DOI: 10.3390/s20113139] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/07/2020] [Revised: 05/22/2020] [Accepted: 05/29/2020] [Indexed: 06/11/2023]
Abstract
The electrocardiogram records the heart's electrical activity and generates a significant amount of data. The analysis of these data helps us to detect diseases and disorders via heart bio-signal abnormality classification. In unbalanced-data contexts, where the classes are not equally represented, the optimization and configuration of the classification models are highly complex, which is reflected in the use of computational resources. Moreover, the performance of electrocardiogram classification depends on the approach and on parameter estimation to generate a model with high accuracy, sensitivity, and precision. Previous works have proposed hybrid approaches, and only a few implemented parameter optimization; instead, they generally applied empirical tuning of parameters at the data level or the algorithm level. Hence, a scheme including metrics of sensitivity at a higher precision and accuracy scale deserves special attention. In this article, a metaheuristic optimization approach for parameter estimation in arrhythmia classification from unbalanced data is presented. We selected an unbalanced subset of the available data to classify eight types of arrhythmia. It is important to highlight that we combined undersampling based on the clustering method (data level) and a feature selection method (algorithmic level) to tackle the unbalanced class problem. To explore parameter estimation and improve the classification of our model, we compared two metaheuristic approaches based on differential evolution and particle swarm optimization. The final results showed an accuracy of 99.95%, an F1 score of 99.88%, a sensitivity of 99.87%, a precision of 99.89%, and a specificity of 99.99%, which are high even in the presence of unbalanced data.
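A toy sketch of differential evolution, one of the two metaheuristics compared in the abstract; the surrogate error function and all parameter choices below are illustrative, not the authors' setup:

```python
import random

# Toy differential evolution (DE/rand/1 with binomial crossover): evolve a
# real-valued parameter vector to minimise a surrogate "classification error",
# here a simple quadratic with its optimum at (3, 0.5).

def error(params):
    x, y = params
    return (x - 3.0) ** 2 + (y - 0.5) ** 2

def differential_evolution(f, bounds, pop_size=20, gens=100, F=0.8, CR=0.9, seed=42):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [ai + F * (bi - ci) if rng.random() < CR else xi
                     for ai, bi, ci, xi in zip(a, b, c, pop[i])]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if f(trial) <= f(pop[i]):  # greedy selection keeps the better vector
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(error, [(0, 10), (0, 1)])
print(best)  # close to [3.0, 0.5]
```
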
Collapse
Affiliation(s)
- Juan Carlos Carrillo-Alarcón
- Department of Computer Science, Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Tonantzintla, Puebla 72840, Mexico;
| | - Luis Alberto Morales-Rosales
- Faculty of Civil Engineering, Conacyt-Universidad Michoacana de San Nicolás de Hidalgo, Morelia 58030, Michoacán, Mexico;
| | | | | | - Antonio Muñoz
- Engineering Department, University of Guadalajara, Av. Independencia Nacional 151, Autlán, Jalisco 48900, Mexico;
| | - Ignacio Algredo-Badillo
- Department of Computer Science, Conacyt-Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Tonantzintla, Puebla 72840, Mexico
| |
Collapse
|
129
|
Shape and Enhancement Analysis as a Useful Tool for the Presentation of Blood Hemodynamic Properties in the Area of Aortic Dissection. J Clin Med 2020; 9:jcm9051330. [PMID: 32370301 PMCID: PMC7290319 DOI: 10.3390/jcm9051330] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/18/2020] [Revised: 04/19/2020] [Accepted: 04/28/2020] [Indexed: 11/17/2022] Open
Abstract
The aim of this study was to create a mathematical approach for blood hemodynamic description with the use of brightness analysis. Medical data were collected from three male patients aged from 45 to 65 years with acute type IIIb aortic dissection that started proximal to the left subclavian artery and involved the renal arteries. For the recognition of wall dissection areas, Digital Imaging and Communications in Medicine (DICOM) data were applied. The distance from the descending aorta to the diaphragm was analyzed. Each time, the Feret (DF) and hydraulic (DHy) diameters were calculated. Moreover, an average brightness (BAV) was analyzed. Finally, to describe blood hemodynamics in the area of aortic wall dissection, a mathematical function combining the difference in brightness value and diameter for each computed tomography (CT) scan was calculated. The results indicated that DF described the common duct more accurately compared to DHy, while DHy described the true and false ducts more accurately. Each time a connection of the true and false ducts appeared, the true duct had lower brightness compared to the common and false ducts. Moreover, the false duct was characterized by higher brightness compared to the common duct. In summary, the proposed algorithm mimics changes in brightness value for patients with acute type IIIb aortic dissection.
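The two diameter measures can be made concrete; D_Hy = 4A/P is the standard hydraulic-diameter formula, and the Feret diameter is taken here as the largest point-to-point distance of the cross-section outline (the shapes below are illustrative, not patient data):

```python
import math

# Sketch of the two lumen-size measures named in the abstract: the Feret
# diameter (largest point-to-point distance of the cross-section outline)
# and the hydraulic diameter D_Hy = 4*A/P for area A and perimeter P.

def feret_diameter(points):
    return max(math.dist(p, q) for p in points for q in points)

def hydraulic_diameter(area, perimeter):
    return 4.0 * area / perimeter

# For a circular lumen of radius r, both measures reduce to the diameter 2r.
r = 1.5
d_circle = hydraulic_diameter(math.pi * r * r, 2 * math.pi * r)  # close to 3.0

# For an elongated (e.g. dissected) lumen they diverge: a 4 x 1 rectangle.
rect = [(0, 0), (4, 0), (4, 1), (0, 1)]
print(round(feret_diameter(rect), 3))          # 4.123 (the diagonal)
print(hydraulic_diameter(4 * 1, 2 * (4 + 1)))  # 1.6
```

This divergence between the two diameters for elongated cross-sections is why one measure suits the roughly circular common duct while the other better describes the flattened true and false ducts.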
Collapse
|
130
|
Quantitative Susceptibility Mapping and Vessel Wall Imaging as Screening Tools to Detect Microbleed in Sentinel Headache. J Clin Med 2020; 9:jcm9040979. [PMID: 32244737 PMCID: PMC7230854 DOI: 10.3390/jcm9040979] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/07/2020] [Revised: 03/24/2020] [Accepted: 03/31/2020] [Indexed: 12/25/2022] Open
Abstract
Background: MR quantitative susceptibility mapping (QSM) can identify microbleeds (MBs) in the intracranial aneurysm (IA) wall associated with sentinel headache (SH) preceding subarachnoid hemorrhage. However, its use is limited due to artifacts associated with skull-base bone and air. MR vessel wall imaging (VWI) is not limited by such artifacts and could therefore be an alternative to QSM. The purpose of this study was to investigate the correlation between QSM and VWI in detecting MBs and to help develop a diagnostic strategy for SH. Methods: We performed a prospective study of subjects with one or more unruptured IAs in our hospital. All subjects underwent evaluation using 3T-MRI for MR angiography (MRA), QSM, and pre- and post-contrast VWI of the IAs. The presence/absence of MBs detected by QSM was correlated with aneurysm wall enhancement (AWE) on VWI. Results: A total of 40 subjects harboring 51 unruptured IAs were enrolled in the study. MBs evident on the QSM sequence were detected in 12 (23.5%) IAs of 11 subjects. All these subjects had a history of severe headache suggestive of SH. AWE was detected in 22 (43.1%) IAs. Using positive QSM as a surrogate for MBs, the sensitivity, specificity, positive predictive value, and negative predictive value of AWE on VWI for detecting MBs were 91.7%, 71.8%, 50%, and 96.6%, respectively. Conclusions: Positive QSM findings strongly suggested the presence of MBs with SH, whereas the lack of AWE on VWI can rule it out with a probability of 96.6%. If proven in a larger cohort, combining QSM and VWI could be an adjunctive tool to help diagnose SH, especially in cases with negative or non-diagnostic CT and lumbar puncture.
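The four accuracy figures are mutually consistent with one 2x2 table (TP = 11, FP = 11, FN = 1, TN = 28), which is reconstructed here from the reported counts (12 QSM-positive of 51 IAs, 22 with AWE) rather than stated explicitly in the abstract; a quick check:

```python
# Reconstructed 2x2 table consistent with the abstract's counts; this is an
# inference from the reported percentages, not data stated in the paper.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

m = diagnostic_metrics(tp=11, fp=11, fn=1, tn=28)
print({k: round(100 * v, 1) for k, v in m.items()})
# {'sensitivity': 91.7, 'specificity': 71.8, 'ppv': 50.0, 'npv': 96.6}
```
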
Collapse
|
131
|
Zindani D, Maity SR, Bhowmik S. Interval-valued intuitionistic fuzzy TODIM method based on Schweizer–Sklar power aggregation operators and their applications to group decision making. Soft comput 2020. [DOI: 10.1007/s00500-020-04783-1] [Citation(s) in RCA: 14] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
132
|
Kim HG, Choi JW, Han M, Lee JH, Lee HS. Texture analysis of deep medullary veins on susceptibility-weighted imaging in infants: evaluating developmental and ischemic changes. Eur Radiol 2020; 30:2594-2603. [DOI: 10.1007/s00330-019-06618-6] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/20/2019] [Revised: 11/07/2019] [Accepted: 12/11/2019] [Indexed: 12/28/2022]
|
133
|
Abstract
Computational ontologies are machine-processable structures which represent particular domains of interest. They integrate knowledge which can be used by humans or machines for decision making and problem solving. The main aim of this systematic review is to investigate the role of formal ontologies in information systems development, i.e., how these graph-based structures can be beneficial during the analysis and design of information systems. Specific online databases were used to identify studies focused on the interconnections between ontologies and systems engineering. One hundred eighty-seven studies were found during the first phase of the investigation; twenty-seven studies were examined after the elimination of duplicate and irrelevant documents. Mind mapping was substantially helpful in organising the basic ideas and in identifying five thematic groups that show the main roles of formal ontologies in information systems development. Formal ontologies are mainly used in the interoperability of information systems, human resource management, domain knowledge representation, the involvement of semantics in unified modelling language (UML)-based modelling, and the management of programming code and documentation. We explain the main ideas in the reviewed studies and suggest possible extensions to this research.
Collapse
|
134
|
Magnus M, Antczak M, Zok T, Wiedemann J, Lukasiak P, Cao Y, Bujnicki JM, Westhof E, Szachniuk M, Miao Z. RNA-Puzzles toolkit: a computational resource of RNA 3D structure benchmark datasets, structure manipulation, and evaluation tools. Nucleic Acids Res 2020; 48:576-588. [PMID: 31799609 PMCID: PMC7145511 DOI: 10.1093/nar/gkz1108] [Citation(s) in RCA: 24] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2019] [Revised: 11/06/2019] [Accepted: 11/15/2019] [Indexed: 12/12/2022] Open
Abstract
Significant improvements have been made in the efficiency and accuracy of RNA 3D structure prediction methods over the successive challenges of RNA-Puzzles, a community-wide effort on the assessment of blind prediction of RNA tertiary structures. The RNA-Puzzles contest has shown, among other things, that the development and validation of computational methods for RNA fold prediction strongly depend on the benchmark datasets and the structure comparison algorithms. Yet, no systematic benchmark set or decoy structures have been available for the 3D structure prediction of RNA, hindering the standardization of comparative tests in the modeling of RNA structure. Furthermore, there has been no unified set of tools that allows deep and complete RNA structure analysis and is, at the same time, easy to use. Here, we present the RNA-Puzzles toolkit, a computational resource including (i) decoy sets generated by different RNA 3D structure prediction methods (raw, for-evaluation and standardized datasets), (ii) 3D structure normalization, analysis, manipulation and visualization tools (RNA_format, RNA_normalizer, rna-tools) and (iii) 3D structure comparison metric tools (RNAQUA, MCQ4Structures). This resource provides a full list of computational tools as well as a standard RNA 3D structure prediction assessment protocol for the community.
Collapse
Affiliation(s)
- Marcin Magnus
- International Institute of Molecular and Cell Biology in Warsaw, 02-109 Warsaw, Poland
- ReMedy-International Research Agenda Unit, Centre of New Technologies, University of Warsaw, 02-097 Warsaw, Poland
| | - Maciej Antczak
- Institute of Computing Science & European Centre for Bioinformatics and Genomics, Poznan University of Technology, 60-965 Poznan, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, 61-704 Poznan, Poland
| | - Tomasz Zok
- Institute of Computing Science & European Centre for Bioinformatics and Genomics, Poznan University of Technology, 60-965 Poznan, Poland
| | - Jakub Wiedemann
- Institute of Computing Science & European Centre for Bioinformatics and Genomics, Poznan University of Technology, 60-965 Poznan, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, 61-704 Poznan, Poland
| | - Piotr Lukasiak
- Institute of Computing Science & European Centre for Bioinformatics and Genomics, Poznan University of Technology, 60-965 Poznan, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, 61-704 Poznan, Poland
| | - Yang Cao
- Center of Growth, Metabolism and Aging, Key Laboratory of Bio-Resource and Eco-Environment of Ministry of Education, College of Life Sciences, Sichuan University, Chengdu 610065, PR China
| | - Janusz M Bujnicki
- International Institute of Molecular and Cell Biology in Warsaw, 02-109 Warsaw, Poland
- Institute of Molecular Biology and Biotechnology, Faculty of Biology, Adam Mickiewicz University, Poznan, Poland
| | - Eric Westhof
- Architecture et Réactivité de l’ARN, Université de Strasbourg, Institut de biologie moléculaire et cellulaire du CNRS, 12 allée Konrad Roentgen, 67084 Strasbourg, France
| | - Marta Szachniuk
- Institute of Computing Science & European Centre for Bioinformatics and Genomics, Poznan University of Technology, 60-965 Poznan, Poland
- Institute of Bioorganic Chemistry, Polish Academy of Sciences, 61-704 Poznan, Poland
| | - Zhichao Miao
- Translational Research Institute of Brain and Brain-Like Intelligence and Department of Anesthesiology, Shanghai Fourth People's Hospital Affiliated to Tongji University School of Medicine, Shanghai 200081, China
- European Molecular Biology Laboratory, European Bioinformatics Institute (EMBL-EBI), Wellcome Genome Campus, Cambridge CB10 1SD, UK
- Newcastle Fibrosis Research Group, Institute of Cellular Medicine, Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne, UK
| |
Collapse
|
135
|
Jureczko M, Nguyen NT, Szymczyk M, Unold O. Towards implementing defect prediction in the software development process. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS 2019. [DOI: 10.3233/jifs-179334] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Affiliation(s)
- Marian Jureczko
- Department of Computer Engineering, Wroclaw University of Science and Technology, Poland
| | - Ngoc Trung Nguyen
- Department of Computer Engineering, Wroclaw University of Science and Technology, Poland
| | - Marcin Szymczyk
- Department of Computer Engineering, Wroclaw University of Science and Technology, Poland
| | - Olgierd Unold
- Department of Computer Engineering, Wroclaw University of Science and Technology, Poland
| |
Collapse
|
136
|
Meng Q, Liu X, Song Y, Wang W. An extended generalized TODIM method for risk assessment of supply chain in social commerce under interval type-2 fuzzy environment. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS 2019. [DOI: 10.3233/jifs-190061] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Affiliation(s)
- Qiqi Meng
- School of Economics and Management, Southeast University, Nanjing, Jiangsu, China
| | - Xinwang Liu
- School of Economics and Management, Southeast University, Nanjing, Jiangsu, China
| | - Yu Song
- School of Computer Science and Engineering, Southeast University, Nanjing, Jiangsu, China
| | - Weizhong Wang
- School of Economics and Management, Southeast University, Nanjing, Jiangsu, China
| |
Collapse
|
137
|
Amin F, Fahmi A, Aslam M. Approaches to multiple attribute group decision making based on triangular cubic linguistic uncertain fuzzy aggregation operators. Soft comput 2019. [DOI: 10.1007/s00500-019-04614-y] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
|
138
|
Wasik S, Jaroszewski M, Nowaczyk M, Szostak N, Prejzendanc T, Blazewicz J. VirDB: Crowdsourced Database for Evaluation of Dynamical Viral Infection Models. Curr Bioinform 2019. [DOI: 10.2174/1574893614666190308155904] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
Background: Open science is an emerging movement underlining the importance of transparent, high-quality research whose results can be verified and reused by others. However, one of the biggest problems in replicating experiments is the lack of access to the data used by the authors. This problem also occurs during mathematical modeling of viral infections, a process that, when conducted correctly, can provide valuable insights into viral activity or into a drug's mechanism of action. Objective: We present the VirDB database (virdb.cs.put.poznan.pl), which has two primary objectives. First, it is a tool for collecting data on viral infections that could be used to develop new dynamic models of infections following the FAIR data-sharing principles. Second, it allows storing references to descriptions of viral infection models, together with their evaluation results. Methods: To facilitate fast population of the database and easy exchange of scientific data, we decided to use crowdsourcing for collecting data. This approach has already proved very successful in projects such as Wikipedia. Conclusion: VirDB builds on the concepts and recommendations of Open Science and shares data using the FAIR principles. Thanks to this, the data required for designing and evaluating models of viral infections can be freely available on the Internet.
Collapse
Affiliation(s)
- Szymon Wasik
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| | - Marcin Jaroszewski
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| | - Mateusz Nowaczyk
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| | - Natalia Szostak
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| | - Tomasz Prejzendanc
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| | - Jacek Blazewicz
- Institute of Computing Science, Poznan University of Technology, Poznan, Poland
| |
Collapse
|
139
|
Abstract
OBJECTIVE. The purpose of this article is to review the nascent field of radiomics in cardiac MRI. CONCLUSION. Cardiac MRI produces a large number of images in a fairly inefficient manner with sometimes limited clinical application. In the era of precision medicine, there is increasing need for imaging to account for a broader array of diseases in an efficient and objective manner. Radiomics, the extraction and analysis of quantitative imaging features from medical imaging, may offer potential solutions to this need.
Collapse
|
140
|
Cloud Brokering with Bundles: Multi-objective Optimization of Services Selection. FOUNDATIONS OF COMPUTING AND DECISION SCIENCES 2019. [DOI: 10.2478/fcds-2019-0020] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Cloud computing has become one of the major computing paradigms. Not only has the number of offered cloud services grown exponentially, but many different providers also compete by proposing very similar services. This situation should eventually benefit customers, but because these services differ slightly both functionally and non-functionally (e.g., in performance, reliability, security), consumers may be confused and unable to make an optimal choice. The emergence of cloud service brokers addresses these issues. A broker gathers information about services from providers and about the needs and requirements of the customers, with the final goal of finding the best match.
In this paper, we formalize and study a novel problem that arises in the area of cloud brokering. In its simplest form, brokering is a trivial assignment problem, but in more complex and realistic cases this no longer holds. The novelty of the presented problem lies in considering services that can be sold in bundles. Bundling is a common business practice in which a set of services is sold together for a lower price than the sum of the prices of the included services. This work introduces a multi-criteria optimization problem that could help customers determine the best IT solutions according to several criteria. The Cloud Brokering with Bundles (CBB) problem models the different IT packages (or bundles) found on the market while minimizing (or maximizing) different criteria. A proof of complexity is given for the single-objective case, and experiments have been conducted with a special case of two criteria: the first being the cost and the second artificially generated. We also designed and developed a benchmark generator based on real data gathered from 19 cloud providers. The problem is solved using an exact optimizer relying on a dichotomic search method. The results show that the dichotomic search can be successfully applied to small instances corresponding to typical cloud-brokering use cases and returns results within seconds. For larger problem instances, solving times are not prohibitive, and solutions for large, corporate clients could be obtained within minutes.
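As a rough illustration of the single-objective core of the CBB problem, the sketch below exhaustively searches for the cheapest set of bundles covering a customer's required services. The market data and service names are hypothetical, and the paper's exact model and dichotomic solver are not reproduced here.

```python
from itertools import combinations

def cheapest_cover(required, bundles):
    """Exhaustively pick the cheapest set of bundles that covers all
    required services (a minimal single-objective CBB-style sketch).
    bundles: list of (price, frozenset_of_services)."""
    best_cost, best_pick = float("inf"), None
    for r in range(1, len(bundles) + 1):
        for pick in combinations(range(len(bundles)), r):
            covered = set()
            for i in pick:
                covered |= bundles[i][1]
            if required <= covered:  # all required services covered
                cost = sum(bundles[i][0] for i in pick)
                if cost < best_cost:
                    best_cost, best_pick = cost, pick
    return best_cost, best_pick

# Hypothetical market: one bundle is cheaper than buying its parts.
bundles = [
    (10, frozenset({"vm"})),
    (12, frozenset({"storage"})),
    (18, frozenset({"vm", "storage"})),  # bundle discount
    (7,  frozenset({"cdn"})),
]
required = {"vm", "storage", "cdn"}
cost, pick = cheapest_cover(required, bundles)
```

The exhaustive search is exponential in the number of bundles, which is exactly why the paper resorts to a dedicated exact optimizer for realistic instance sizes.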
Collapse
|
141
|
Gou X, Liao H, Wang X, Xu Z, Herrera F. CONSENSUS BASED ON MULTIPLICATIVE CONSISTENT DOUBLE HIERARCHY LINGUISTIC PREFERENCES: VENTURE CAPITAL IN REAL ESTATE MARKET. INTERNATIONAL JOURNAL OF STRATEGIC PROPERTY MANAGEMENT 2019. [DOI: 10.3846/ijspm.2019.10431] [Citation(s) in RCA: 17] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
Based on Computing with Words (CW), the double hierarchy hesitant fuzzy linguistic term set (DHHFLTS) can express complex linguistic information accurately with two simple linguistic hierarchies. This paper proposes a group decision making (GDM) model based on multiplicative consistency and consensus with double hierarchy hesitant fuzzy linguistic preference relations (DHHFLPRs). Firstly, a correlation coefficient of DHHFLTSs is defined based on the distance measures of double hierarchy hesitant fuzzy linguistic elements (DHHFLEs). Then, a multiplicative consistency property of DHHFLPRs is investigated, and a consistency checking method and a feedback-mechanism-based repairing algorithm are developed to ensure that all DHHFLPRs have acceptable multiplicative consistency. Furthermore, a correlation measure for DHHFLPRs based on the correlation coefficient of DHHFLTSs is proposed, and a new consensus reaching method on the basis of the correlation measure is developed, which can be used to fully obtain the consensus degree from both positive and negative angles. Finally, we make some comparative analyses with other existing consistency checking and repairing methods, as well as consensus reaching approaches, to illustrate the effectiveness of the proposed method through a case study concerning the assessment of a venture capital project in the real estate market of some cities in China.
Collapse
Affiliation(s)
- Xunjie Gou
- Business School, Sichuan University, Chengdu 610064, China; Andalusian Research Institute in Data Science and Computational Intelligence (DaSCI), University of Granada, 18071 Granada, Spain
| | - Huchang Liao
- Business School, Sichuan University, Chengdu 610064, China; Andalusian Research Institute in Data Science and Computational Intelligence (DaSCI), University of Granada, 18071 Granada, Spain
| | - Xinxin Wang
- Business School, Sichuan University, Chengdu 610064, China
| | - Zeshui Xu
- Business School, Sichuan University, Chengdu 610064, China
| | - Francisco Herrera
- Andalusian Research Institute in Data Science and Computational Intelligence (DaSCI), University of Granada, 18071 Granada, Spain; Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Saudi Arabia
| |
Collapse
|
142
|
Abstract
Actual manufacturing enterprises usually solve the production blockage problem by increasing the public buffer. However, the increase of the public buffer makes flexible flow shop scheduling rather challenging. In order to solve the flexible flow shop scheduling problem with public buffer (FFSP–PB), this study proposes a novel method combining a simulated annealing algorithm-based Hopfield neural network algorithm (SAA–HNN) with local scheduling rules. The SAA–HNN algorithm is used as the global optimization method and constructs the energy function of FFSP–PB to exploit its asymptotically stable characteristic. To overcome limitations such as a small search range and a high probability of falling into a local extremum, the algorithm introduces the simulated annealing idea so that it can accept poor-fitness solutions and further expand its search scope during asymptotic convergence. In the process of local scheduling, considering the transfer time of workpieces moving into and out of the public buffer and the manufacturing state of workpieces in the production process, this study designed several local scheduling rules to control the movement of workpieces between the public buffer and the limited buffers between stages. These local scheduling rules can also be used to reduce production blockage and improve the efficiency of workpiece transfer. Evaluated on groups of simulation schemes using actual production data from a bus manufacturing enterprise, the proposed method outperforms other methods in terms of search efficiency and optimization target.
Collapse
|
143
|
A Production Inventory Model for Deteriorating Items with Collaborative Preservation Technology Investment Under Carbon Tax. SUSTAINABILITY 2019. [DOI: 10.3390/su11185027] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The increase in carbon emissions is considered one of the major causes of global warming and climate change. To reduce the potential environmental and economic threat from such greenhouse gas emissions, governments must formulate policies related to carbon emissions. Most economists favor the carbon tax as an approach to reduce greenhouse gas emissions. This market-based approach is expected to inevitably affect enterprises’ operating activities such as production, inventory, and equipment investment. Therefore, in this study, we investigate a production inventory model for deteriorating items under a carbon tax policy and collaborative preservation technology investment from the perspective of supply chain integration. Our main purpose is to determine the optimal production, delivery, ordering, and investment policies for the buyer and vendor that maximize the joint total profit per unit time in consideration of the carbon tax policy. We present several numerical examples to demonstrate the solution procedures, and we conduct sensitivity analyses of the optimal solutions with respect to major parameters for identifying several managerial implications that provide a useful decision tool for the relevant managers. We hope that the study results assist government organizations in selecting a more appropriate carbon emissions policy for the carbon reduction trend.
Collapse
|
144
|
Gugaliya A, Boral S, Naikan V. A hybrid decision making framework for modified failure mode effects and criticality analysis. INTERNATIONAL JOURNAL OF QUALITY & RELIABILITY MANAGEMENT 2019. [DOI: 10.1108/ijqrm-08-2018-0213] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Purpose
Assessing the severity of failure modes of critical industrial machinery is often considered an onerous task and is sometimes misinterpreted by shop-floor engineers and maintenance personnel. The purpose of this paper is to develop an improved FMECA method for prioritizing failure modes according to their risk levels and to validate it through a real case study of induction motors used in a process plant.
Design/methodology/approach
This paper presents a novel hybrid multi-criteria decision-making (MCDM) approach to prioritize different failure modes according to their risk levels by combining analytical hierarchy process (AHP) with a newly introduced MCDM approach, election based on relative value distance (ERVD). AHP is incorporated in the proposed approach to determine the criteria weights, evaluated in linguistic terms by industrial expert. Furthermore, ERVD, which is based on the concept of prospect theory of human cognitive process, is applied to rank the potential failure modes.
Findings
It is found that the proposed FMECA approach provides better results in accordance with the actual industrial scenario and helps in effectively prioritizing the failure modes. A comparison is also made to highlight how the results of the proposed approach differ from those of TOPSIS and of conventional FMECA.
Research limitations/implications
This research paper proposes an improved FMECA method and, thus, provides a deep insight to maintenance managers for effectively prioritizing the failure modes. The correct prioritization of failure modes will help in effective maintenance planning, thus reducing the downtime and improving profit to the organization.
Practical implications
A real case of process plant induction motor has been introduced in the research paper to show the applicability of this decision-making approach, and the approach is found to be suitable in correct prioritization of the failure modes.
Originality/value
Severity has been decoupled into various factors affecting it, to make it more relevant as per actual industrial scenario. Then, a novel modified FMECA has been developed using a hybrid MCDM approach (AHP and ERVD). This hybrid method, as well as its application in FMECA, has not been developed by any previous researcher. Moreover, the same has been thoroughly explained by considering a real case of process plant induction motors and validated with cross-functional experts.
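The AHP weighting step used in such hybrid approaches can be illustrated with the classic row geometric-mean approximation of the principal eigenvector. This is a generic AHP sketch with a hypothetical judgement matrix, not the authors' actual criteria, experts, or weights.

```python
import math

def ahp_weights(M):
    """Criteria weights from a reciprocal pairwise comparison matrix,
    using the row geometric-mean approximation of the principal
    eigenvector of M."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]  # row geometric means
    s = sum(gm)
    return [g / s for g in gm]  # normalize to sum to 1

# Hypothetical 3-criterion judgement matrix on the usual 1-9 scale:
# criterion 1 is 3x as important as 2 and 5x as important as 3.
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_weights(M)
```

In a full AHP workflow one would also compute the consistency ratio of the judgement matrix before trusting the weights; that check is omitted here for brevity.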
Collapse
|
145
|
Dou RL, Hu B, Shi WJ. Incremental Multi-Hop Localization Algorithm Based on Regularized Weighted Least Squares. INT J PATTERN RECOGN 2019. [DOI: 10.1142/s0218001419590328] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
Incremental multi-hop localization algorithms apply to networks with a broad range and a low density of anchor nodes. However, during the localization process, they tend to be affected by accumulative errors and by the collinearity problem between anchor nodes. We propose an incremental multi-hop localization algorithm based on the regularized weighted least squares method: it uses weighted least squares to reduce the influence of accumulative errors and regularization to weaken the collinearity problem between anchor nodes. The results of both real and simulated experiments show that, compared to previous incremental multi-hop localization algorithms, the proposed algorithm not only handles the accumulated-error problem well and obtains high localization accuracy, but also accounts for the influence of collinearity on the localization computation. We evaluate our method on various network scenes, analyze its performance, compare it with several existing methods, and demonstrate its high efficiency.
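The core regularized weighted least squares estimate can be sketched for the 2-D case by linearizing the range equations against a reference anchor and solving the regularized, weighted normal equations. This is a generic illustration with hypothetical coordinates and uniform weights, not the paper's exact incremental algorithm.

```python
def rwls_locate(anchors, dists, weights, lam=1e-6):
    """Estimate a 2-D position from anchor coordinates and measured
    distances via linearized, regularized weighted least squares.
    lam is the ridge term that tames near-collinear anchor layouts."""
    (x0, y0), d0 = anchors[0], dists[0]
    A, b, w = [], [], []
    # Subtract the reference anchor's range equation to linearize.
    for (xi, yi), di, wi in zip(anchors[1:], dists[1:], weights[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0 * d0 - di * di + xi * xi - x0 * x0 + yi * yi - y0 * y0)
        w.append(wi)
    # Normal equations: (A^T W A + lam*I) p = A^T W b  (2x2 system)
    s11 = sum(wi * a[0] * a[0] for a, wi in zip(A, w)) + lam
    s12 = sum(wi * a[0] * a[1] for a, wi in zip(A, w))
    s22 = sum(wi * a[1] * a[1] for a, wi in zip(A, w)) + lam
    t1 = sum(wi * a[0] * bi for a, bi, wi in zip(A, b, w))
    t2 = sum(wi * a[1] * bi for a, bi, wi in zip(A, b, w))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

# Hypothetical noise-free test: true node at (2, 3).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [((2 - x) ** 2 + (3 - y) ** 2) ** 0.5 for x, y in anchors]
pos = rwls_locate(anchors, dists, [1.0, 1.0, 1.0])
```

In a multi-hop setting the weights would typically down-weight anchors reached over many hops, since their estimated distances accumulate more error.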
Collapse
Affiliation(s)
- Ru-Lin Dou
- School of Software Engineering, Jinling Institute of Technology, Nanjing, Jiangsu, P. R. China
| | - Bo Hu
- Center of Information Construction and Management, Nanjing Normal University of Special Education, Nanjing, Jiangsu, P. R. China
| | - Wei-Juan Shi
- Equipment Management Department, Jinling Institute of Technology, Nanjing, Jiangsu, P. R. China
| |
Collapse
|
146
|
Zhang DG, Cui YY, Zhang T. New quantum-genetic based OLSR protocol (QG-OLSR) for Mobile Ad hoc Network. Appl Soft Comput 2019. [DOI: 10.1016/j.asoc.2019.03.053] [Citation(s) in RCA: 33] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
|
147
|
Recommendation Framework Combining User Interests with Fashion Trends in Apparel Online Shopping. APPLIED SCIENCES-BASEL 2019. [DOI: 10.3390/app9132634] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Although fashion-related products account for most online shopping categories, it becomes increasingly difficult for users to search for and find products matching their taste and needs as the number of items available online grows explosively. Personalized recommendation of items is the best method both for reducing user effort in searching for items and for expanding sales opportunities for sellers. Unfortunately, experimental studies and research on fashion item recommendation for online shopping users are lacking. In this paper, we propose a novel recommendation framework suitable for online apparel items. To overcome the rating sparsity problem of online apparel datasets, we derive implicit ratings from user log data and generate predicted ratings for item clusters by user-based collaborative filtering. The ratings are combined with a network constructed from an item click trend, which serves as a personalized recommendation through a random walk. An empirical evaluation on a large-scale real-world dataset obtained from an apparel retailer demonstrates the effectiveness of our method.
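The user-based collaborative filtering step can be sketched as a mean-centered, similarity-weighted prediction over neighbours' ratings. The ratings below are hypothetical stand-ins for the implicit ratings the paper derives from click logs; the click-trend network and random walk are not reproduced here.

```python
import math

def cosine(a, b):
    """Cosine similarity computed over the co-rated items of two
    {item: rating} dicts; 0.0 if the users share no items."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    da = math.sqrt(sum(a[i] ** 2 for i in common))
    db = math.sqrt(sum(b[i] ** 2 for i in common))
    return num / (da * db)

def predict(user, item, ratings):
    """User-based CF: target user's mean rating plus the
    similarity-weighted rating deviations of neighbours on the item."""
    ru = sum(ratings[user].values()) / len(ratings[user])
    num = den = 0.0
    for v, rv in ratings.items():
        if v == user or item not in rv:
            continue
        s = cosine(ratings[user], rv)
        rv_bar = sum(rv.values()) / len(rv)
        num += s * (rv[item] - rv_bar)
        den += abs(s)
    return ru if den == 0 else ru + num / den

# Hypothetical implicit ratings derived from click logs (1-5 scale).
ratings = {
    "u1": {"coat": 5, "shoes": 3},
    "u2": {"coat": 4, "shoes": 2, "hat": 5},
    "u3": {"coat": 1, "shoes": 5, "hat": 2},
}
p = predict("u1", "hat", ratings)
```

Since u1's tastes track u2's far more closely than u3's, the prediction for "hat" lands well above u1's mean rating.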
Collapse
|
148
|
Hirai Y, Mizukami T, Suzuki Y, Tsuji T, Watanabe T. Hierarchical Proximity Sensor for High-Speed and Intelligent Control of Robotic Hand. JOURNAL OF ROBOTICS AND MECHATRONICS 2019. [DOI: 10.20965/jrm.2019.p0453] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
This study proposes a hierarchical proximity information processing system and a novel proximity sensor for realizing high-speed and robust grasp control of a robotic hand. The sensor requires both fast response and advanced situation judgment abilities. Therefore, a function that digitally samples individual reaction amounts of all detection elements is added to the net-structure proximity sensor (NSPS), which extracts the sum and center position of the distribution of reaction amounts of detection elements by high-speed analog computation on the sensor circuit. To integrate these two functions, we construct a circuit design method that enables the coexistence of a multichannel A/D converter circuit on the analog computing circuit of the NSPS without disturbing the current flow for sensing; the proposed sensor is called the “hierarchical proximity sensor.” An analysis of its characteristics indicates that the sensor can be used for feedback control of the fingertip position/posture and to estimate the curvature of objects. Through an experiment conducted using a robotic hand equipped with the proposed sensor, we confirmed that the fingertip can approach an object in 0.18 s based on the high-speed analog computation information, while the information for improving the motion can be obtained by comparing the temporal change in the finger joints with the digital sampling information of the process.
Collapse
|
149
|
Tackling the Problem of Class Imbalance in Multi-class Sentiment Classification: An Experimental Study. FOUNDATIONS OF COMPUTING AND DECISION SCIENCES 2019. [DOI: 10.2478/fcds-2019-0009] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
Sentiment classification is an important task which has gained extensive attention both in academia and in industry. Many issues related to this task, such as the handling of negation or of sarcastic utterances, were analyzed and accordingly addressed in previous works. However, the issue of class imbalance, which often compromises the prediction capabilities of learning algorithms, was scarcely studied. In this work, we aim to bridge the gap between imbalanced learning and sentiment analysis. An experimental study including twelve imbalanced learning preprocessing methods, four feature representations, and a dozen datasets is carried out in order to analyze the usefulness of imbalanced learning methods for sentiment classification. Moreover, the data difficulty factors commonly studied in imbalanced learning are investigated on sentiment corpora to evaluate the impact of class imbalance.
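The simplest of the preprocessing methods compared in such studies, random oversampling, can be sketched as follows. The tiny corpus is hypothetical, and the paper's twelve methods are not reproduced here.

```python
import random
from collections import Counter

def random_oversample(X, y, seed=0):
    """Duplicate randomly chosen minority-class examples until every
    class matches the majority class size (the standard baseline
    preprocessing method in imbalanced learning)."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    Xr, yr = list(X), list(y)
    for label, n in counts.items():
        idx = [i for i, lab in enumerate(y) if lab == label]
        for _ in range(target - n):
            i = rng.choice(idx)  # resample with replacement
            Xr.append(X[i])
            yr.append(label)
    return Xr, yr

# Hypothetical three-class sentiment corpus skewed towards "neu".
X = ["great", "awful", "ok", "fine", "meh", "so-so"]
y = ["pos", "neg", "neu", "neu", "neu", "neu"]
Xr, yr = random_oversample(X, y)
```

Oversampling must be applied only to the training split; duplicating examples before a train/test split leaks test instances into training.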
Collapse
|
150
|
RNApolis: Computational Platform for RNA Structure Analysis. FOUNDATIONS OF COMPUTING AND DECISION SCIENCES 2019. [DOI: 10.2478/fcds-2019-0012] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
In the 1970s, computer scientists began to engage in research in the field of structural biology. The first structural databases, as well as models and methods supporting the analysis of biomolecule structures, started to be created. RNA was put at the centre of scientific interest quite late; however, more and more methods dedicated to this molecule are currently being developed. This paper presents RNApolis, a new computing platform which offers access to seven bioinformatics tools developed to support the study of RNA structure. The set of tools includes a structural database and systems for predicting, modelling, annotating and evaluating RNA structure. RNApolis supports research at different structural levels and allows the discovery, establishment, and validation of relationships between the primary, secondary and tertiary structure of RNAs. The platform is freely available at http://rnapolis.pl
Collapse
|