Handling the inconsistency of systems of $\min-\rightarrow$ fuzzy relational equations

Baaj, Ismaïl

arXiv.org Artificial Intelligence

In this article, we study the inconsistency of systems of $\min-\rightarrow$ fuzzy relational equations. We give analytical formulas for computing the Chebyshev distances $\nabla = \inf_{d \in \mathcal{D}} \Vert \beta - d \Vert$ associated to systems of $\min-\rightarrow$ fuzzy relational equations of the form $\Gamma \Box_{\rightarrow}^{\min} x = \beta$, where $\rightarrow$ is a residual implicator among the G\"odel implication $\rightarrow_G$, the Goguen implication $\rightarrow_{GG}$ and Lukasiewicz's implication $\rightarrow_L$, and $\mathcal{D}$ is the set of second members of consistent systems defined with the same matrix $\Gamma$. The main preliminary result that allows us to obtain these formulas is that the Chebyshev distance $\nabla$ is the lower bound of the solutions of a vector inequality, whichever residual implicator is used. Finally, we show that, in the case of the $\min-\rightarrow_{G}$ system, the Chebyshev distance $\nabla$ may be an infimum that is not attained, while it is always a minimum for $\min-\rightarrow_{GG}$ and $\min-\rightarrow_{L}$ systems.
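
The consistency question for the $\min-\rightarrow_G$ case can be illustrated numerically. The Python sketch below (hypothetical helper names; it does not reproduce the paper's analytical formulas) checks consistency of a system $\Gamma \Box_{\rightarrow_G}^{\min} x = \beta$ through its least solution, and approximates $\nabla$ by bisection, assuming the standard characterization that $\gamma \rightarrow_G x \geq c$ iff $x \geq \min(\gamma, c)$.

```python
# Numerical sketch for a min-Goedel system (Gamma box x)_i = min_j (g_ij ->_G x_j),
# with the Goedel implication a ->_G b = 1 if a <= b, else b. Values lie in [0, 1].

def min_godel(G, x):
    """Composition (G box x)_i = min_j (g_ij ->_G x_j)."""
    return [min(1.0 if g <= xj else xj for g, xj in zip(row, x)) for row in G]

def least_sol_geq(G, c):
    """Least x with (G box x) >= c componentwise: x_j = max_i min(g_ij, c_i)."""
    return [max(min(G[i][j], c[i]) for i in range(len(G)))
            for j in range(len(G[0]))]

def consistent(G, beta, tol=1e-9):
    """The system is consistent iff the least solution of (G box x) >= beta
    satisfies the equality (G box x) = beta."""
    x = least_sol_geq(G, beta)
    return all(abs(u - v) <= tol for u, v in zip(min_godel(G, x), beta))

def chebyshev_inf(G, beta, iters=60):
    """Bisection on d: some consistent second member lies within L_inf
    distance d of beta iff (G box least_sol_geq(G, max(beta - d, 0))) <= beta + d."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        d = (lo + hi) / 2
        c_low = [max(b - d, 0.0) for b in beta]
        reached = min_godel(G, least_sol_geq(G, c_low))
        if all(v <= b + d + 1e-12 for v, b in zip(reached, beta)):
            hi = d
        else:
            lo = d
    return hi
```

For instance, with $\Gamma = (0.7)$ and $\beta = (0.8)$ the bisection converges to $0.1$: the attainable second members are $[0, 0.7) \cup \{1\}$, so the value $0.7$ can be approached but never reached, a toy instance of the infimum (non-minimum) phenomenon mentioned above for $\rightarrow_G$.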


Chebyshev distances associated to the second members of systems of Max-product/Lukasiewicz Fuzzy relational equations

Baaj, Ismaïl

arXiv.org Artificial Intelligence

In this article, we study the inconsistency of a system of $\max$-product fuzzy relational equations and of a system of $\max$-Lukasiewicz fuzzy relational equations. For a system of $\max-\min$ fuzzy relational equations $A \Box_{\min}^{\max} x = b$ and using the $L_\infty$ norm, (Baaj, 2023) showed that the Chebyshev distance $\Delta = \inf_{c \in \mathcal{C}} \Vert b - c \Vert$, where $\mathcal{C}$ is the set of second members of consistent systems defined with the same matrix $A$, can be computed by an explicit analytical formula in terms of the components of the matrix $A$ and of its second member $b$. In this article, we give analytical formulas analogous to that of (Baaj, 2023) to compute the Chebyshev distance associated to the second member of a system of $\max$-product fuzzy relational equations and that associated to the second member of a system of $\max$-Lukasiewicz fuzzy relational equations.
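
As a numerical counterpart to such analytical formulas, the Chebyshev distance for the $\max$-product case can also be approximated by bisection. The Python sketch below (hypothetical names, not the paper's explicit formula) uses the Goguen residuum $a \rightarrow_{GG} c = 1$ if $a \leq c$, else $c/a$, to build the greatest solution of $A \Box x \leq c$, and the monotonicity of the composition to test whether some consistent second member lies within distance $d$ of $b$.

```python
# Numerical sketch for a max-product system (A box x)_i = max_j a_ij * x_j,
# over [0, 1]. Hypothetical helper names, for illustration only.

def max_product(A, x):
    """Composition (A box x)_i = max_j a_ij * x_j."""
    return [max(a * xj for a, xj in zip(row, x)) for row in A]

def greatest_sol_leq(A, c):
    """Greatest x with (A box x) <= c: x_j = min_i (a_ij ->_GG c_i),
    where a ->_GG c = 1 if a <= c, else c / a (Goguen residuum)."""
    return [min(1.0 if A[i][j] <= c[i] else c[i] / A[i][j]
                for i in range(len(A)))
            for j in range(len(A[0]))]

def chebyshev_dist(A, b, iters=60):
    """Bisection on d: some consistent second member lies within L_inf
    distance d of b iff (A box greatest_sol_leq(A, min(b + d, 1))) >= b - d."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        d = (lo + hi) / 2
        c_high = [min(bi + d, 1.0) for bi in b]
        reached = max_product(A, greatest_sol_leq(A, c_high))
        if all(v >= bi - d - 1e-12 for v, bi in zip(reached, b)):
            hi = d
        else:
            lo = d
    return hi
```

For example, with $A = (0.5)$ and $b = (0.8)$ the attainable second members form $[0, 0.5]$, and the bisection converges to the distance $0.3$, attained at $c = 0.5$.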


Max-min Learning of Approximate Weight Matrices from Fuzzy Data

Baaj, Ismaïl

arXiv.org Artificial Intelligence

In this article, we study the approximate solutions set $\Lambda_b$ of an inconsistent system of $\max-\min$ fuzzy relational equations $(S): A \Box_{\min}^{\max}x =b$. Using the $L_\infty$ norm, we compute by an explicit analytical formula the Chebyshev distance $\Delta~=~\inf_{c \in \mathcal{C}} \Vert b -c \Vert$, where $\mathcal{C}$ is the set of second members of the consistent systems defined with the same matrix $A$. We study the set $\mathcal{C}_b$ of Chebyshev approximations of the second member $b$, i.e., vectors $c \in \mathcal{C}$ such that $\Vert b -c \Vert = \Delta$, which is associated to the approximate solutions set $\Lambda_b$ in the following sense: an element of the set $\Lambda_b$ is a solution vector $x^\ast$ of a system $A \Box_{\min}^{\max}x =c$ where $c \in \mathcal{C}_b$. As main results, we describe both the structure of the set $\Lambda_b$ and that of the set $\mathcal{C}_b$. We then introduce a paradigm for the $\max-\min$ learning of weight matrices that relate the input data to the output data of a training set. The learning error is expressed in terms of the $L_\infty$ norm. We compute by an explicit formula the minimal value of the learning error in terms of the training data. We give a method to construct weight matrices whose learning error is minimal, which we call approximate weight matrices. Finally, as an application of our results, we show how the rule parameters of a possibilistic rule-based system can be approximately learned from multiple training data.
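
The relationship between $\Delta$, $\mathcal{C}_b$ and $\Lambda_b$ described above can be sketched numerically. The Python code below (hypothetical names; a numerical sketch, not the paper's explicit formulas) approximates $\Delta$ by bisection using the Gödel residuum, then extracts one approximate solution $x^\ast$ together with a consistent second member $c^\ast$ at distance $\Delta$ from $b$, up to bisection tolerance.

```python
# Numerical sketch for a max-min system (A box x)_i = max_j min(a_ij, x_j),
# over [0, 1]. Hypothetical helper names, for illustration only.

def max_min(A, x):
    """Composition (A box x)_i = max_j min(a_ij, x_j)."""
    return [max(min(a, xj) for a, xj in zip(row, x)) for row in A]

def greatest_sol_leq(A, c):
    """Greatest x with (A box x) <= c: x_j = min_i (a_ij ->_G c_i),
    where a ->_G c = 1 if a <= c, else c (Goedel residuum)."""
    return [min(1.0 if A[i][j] <= c[i] else c[i] for i in range(len(A)))
            for j in range(len(A[0]))]

def chebyshev(A, b, iters=60):
    """Bisection on d: some consistent second member lies within L_inf
    distance d of b iff (A box greatest_sol_leq(A, min(b + d, 1))) >= b - d."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        d = (lo + hi) / 2
        c_high = [min(bi + d, 1.0) for bi in b]
        reached = max_min(A, greatest_sol_leq(A, c_high))
        if all(v >= bi - d - 1e-12 for v, bi in zip(reached, b)):
            hi = d
        else:
            lo = d
    return hi

def approx_solution(A, b):
    """One approximate solution x* and its consistent second member c*:
    c* = A box x_hat(min(b + Delta, 1)) is within Delta of b, and x* solves
    A box x = c*."""
    d = chebyshev(A, b)
    c_high = [min(bi + d, 1.0) for bi in b]
    x_star = greatest_sol_leq(A, c_high)
    return x_star, max_min(A, x_star)
```

For instance, with $A = (0.6)$ and $b = (0.9)$, the attainable second members form $[0, 0.6]$, so $\Delta = 0.3$ is a minimum, attained at $c^\ast = 0.6$ with $x^\ast = 1$.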