
A review on cost-based feature selection algorithms in the various applications of machine learning
Journal of Mahani Mathematical Research
Article in Press, accepted, available online from 04 Tir 1404. Full text: 1.65 MB
Article type: Research Paper
DOI: 10.22103/jmmr.2025.24028.1696
Authors
Saba Beiranvand 1; Mohammad Bagher Dowlatshahi* 2; Amin Hashemi 2
1 Department of Computer Engineering, National University of Skills (NUS), Tehran, Iran
2 Department of Computer Engineering, Lorestan University, Khorramabad, Iran
Abstract
Knowledge acquisition is the most important challenge in building an expert system in any field, and one of the sources of knowledge is the data collected in that field. Traditionally, the data collection process is assumed to have a symmetric cost; in the medical field, for example, this assumption is not acceptable because of the varying expenses involved. Designing a cost-sensitive classification method and designing a cost-sensitive feature selection method are two approaches to accounting for cost factors. Cost-effective feature selection improves financial return by significantly reducing the cost of feature data and by limiting credit losses; it can be applied in areas such as computer imaging and medical diagnosis, which also involve large numbers of features that may be irrelevant or redundant. Analysis of the research reviewed in this study shows that cost-sensitive feature selection focuses on selecting a feature subset with minimum total cost while achieving a classification accuracy that is as high as possible. The review of the selected studies shows a downward trend in the use of heuristic methods in this field; wrapper methods rank first among the evaluation approaches used; and 76% of the selected studies fall into the single-objective category. Based on the number of labels considered, most of the studies fall into the single-label category.
Keywords
Cost-based approaches; Cost-sensitive classification; Feature selection; Single-label data
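
The abstract describes cost-sensitive feature selection as choosing a feature subset whose total acquisition cost is as low as possible while classification accuracy stays high. The sketch below is a minimal, hypothetical illustration of one such cost-based filter, not a method from the paper: features are ranked by estimated mutual information per unit cost and added greedily under an assumed cost budget. The per-feature costs, the budget, and the synthetic dataset are all made-up values for illustration only.

```python
# Minimal sketch (assumptions: synthetic data, hypothetical feature costs and budget).
# Ranks features by relevance-per-cost and greedily selects them within a budget.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)
costs = rng.uniform(1.0, 10.0, size=X.shape[1])   # hypothetical per-feature acquisition costs
budget = 15.0                                      # hypothetical total-cost budget

relevance = mutual_info_classif(X, y, random_state=0)  # estimated relevance of each feature
score = relevance / costs                              # relevance gained per unit of cost

selected, spent = [], 0.0
for j in np.argsort(score)[::-1]:                  # most cost-effective features first
    if spent + costs[j] <= budget:
        selected.append(int(j))
        spent += costs[j]

print("selected features:", sorted(selected), "total cost:", round(spent, 2))
```

In a wrapper variant of the same idea, the greedy loop would re-train a classifier at each step and trade accuracy gain against feature cost directly, rather than relying on a precomputed relevance score.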