dc.rights.license | Attribution 4.0 International | * |
dc.contributor.author | Abu-Dakka, Fares J. | |
dc.contributor.other | Omari, Sara | |
dc.contributor.other | Omari, Adil | |
dc.contributor.other | Abderrahim, Mohamed | |
dc.date.accessioned | 2024-10-16T15:14:55Z | |
dc.date.available | 2024-10-16T15:14:55Z | |
dc.date.issued | 2024 | |
dc.identifier.issn | 2313-7673 | en |
dc.identifier.other | https://katalogoa.mondragon.edu/janium-bin/janium_login_opac.pl?find&ficha_no=178133 | en |
dc.identifier.uri | https://hdl.handle.net/20.500.11984/6667 | |
dc.description.abstract | Individuals grappling with severe central nervous system injuries often face significant challenges related to sensorimotor function and communication abilities. In response, brain–computer interface (BCI) technology has emerged as a promising solution, offering innovative interaction methods and intelligent rehabilitation training. By leveraging electroencephalographic (EEG) signals, BCIs unlock intriguing possibilities in patient care and neurological rehabilitation. Recent research has utilized covariance matrices as signal descriptors. In this study, we introduce two methodologies for covariance matrix analysis: multiple tangent space projections (M-TSPs) and Cholesky decomposition. Both approaches incorporate a classifier that integrates linear and nonlinear features, resulting in a significant enhancement in classification accuracy, as evidenced by our experimental evaluations. The M-TSP method demonstrates superior performance, with an average accuracy improvement of 6.79% over Cholesky decomposition. Additionally, a gender-based analysis reveals higher classification accuracy for male participants, with an average improvement of 9.16% over female participants. These findings underscore the potential of our methodologies to improve BCI performance and highlight gender-specific performance differences that warrant further examination in future studies. | en |
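The abstract describes two covariance-based EEG descriptors: projection of trial covariance matrices into a Riemannian tangent space, and vectorization of their Cholesky factors. The following is a minimal sketch of those two descriptors only, assuming NumPy/SciPy and synthetic data in place of real EEG trials; the authors' actual M-TSP pipeline and gate-generated weight classifier are not reproduced here, and all names are illustrative.

```python
import numpy as np
from scipy.linalg import logm, sqrtm, cholesky

def trial_covariance(trial):
    """Sample covariance of one EEG trial (channels x samples)."""
    x = trial - trial.mean(axis=1, keepdims=True)
    return x @ x.T / (x.shape[1] - 1)

def tangent_space_vector(C, C_ref):
    """Log-map the SPD matrix C to the tangent space at C_ref, keep upper triangle."""
    ref_inv_sqrt = np.real(np.linalg.inv(sqrtm(C_ref)))
    S = np.real(logm(ref_inv_sqrt @ C @ ref_inv_sqrt))
    return S[np.triu_indices_from(S)]

def cholesky_vector(C):
    """Alternative descriptor: vectorized lower-triangular Cholesky factor of C."""
    L = cholesky(C, lower=True)
    return L[np.tril_indices_from(L)]

# Synthetic stand-in for EEG trials: 10 trials, 8 channels, 256 samples.
rng = np.random.default_rng(0)
trials = rng.standard_normal((10, 8, 256))
covs = np.array([trial_covariance(t) for t in trials])
C_ref = covs.mean(axis=0)  # simple arithmetic-mean reference point
ts_features = np.array([tangent_space_vector(C, C_ref) for C in covs])
chol_features = np.array([cholesky_vector(C) for C in covs])
```

Either feature matrix could then be fed to a downstream classifier; the M-TSP approach named in the abstract uses multiple reference points rather than the single one shown here.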
dc.language.iso | eng | en |
dc.publisher | MDPI | en |
dc.rights | © 2024 The Authors | en |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
dc.subject | brain–computer interfaces | en |
dc.subject | motor imagery | en |
dc.subject | tangent space | en |
dc.subject | gender-based analysis | en |
dc.title | EEG motor imagery classification: tangent space with gate-generated weight classifier | en |
dcterms.accessRights | http://purl.org/coar/access_right/c_abf2 | en |
dcterms.source | Biomimetics | en |
local.contributor.group | Robótica y automatización | es |
local.description.peerreviewed | true | en |
local.identifier.doi | https://doi.org/10.3390/biomimetics9080459 | en |
local.contributor.otherinstitution | https://ror.org/03ths8210 | en |
oaire.format.mimetype | application/pdf | en |
oaire.file | $DSPACE\assetstore | en |
oaire.resourceType | http://purl.org/coar/resource_type/c_6501 | en |
oaire.version | http://purl.org/coar/version/c_970fb48d4fbd8a85 | en |
oaire.funderName | Gobierno Vasco | en |
oaire.funderName | Gobierno Vasco | en |
oaire.funderIdentifier | https://ror.org/00pz2fp31 / http://data.crossref.org/fundingdata/funder/10.13039/501100003086 | en |
oaire.fundingStream | Elkartek 2022 | en |
oaire.fundingStream | Elkartek 2023 | en |
oaire.awardNumber | KK-2022-00024 | en |
oaire.awardNumber | KK-2023-00055 | en |
oaire.awardTitle | Producción Fluida y Resiliente para la Industria Inteligente (PROFLOW) | en |
oaire.awardTitle | Tecnologías de Inteligencia Artificial para la percepción visual y háptica y la planificación y control de tareas de manipulación (HELDU) | en |
oaire.awardURI | No information | en |
oaire.awardURI | No information | en |