
1 Introduction

In its simplest form, granules or information granules are building blocks of a reasoning or computational procedure in soft or hard contexts. Information granulation can be viewed as a human way of achieving complexity reduction (rather than data compression) that often plays a key role in divide-and-conquer strategies used in human problem-solving. Granulations are collections of granules that have been integrated by some processes that involve indistinguishability, similarity, proximity or functionality. Associated soft contexts typically involve vagueness, uncertainty, indecision or fuzziness and some level of indeterminacy. This has led to many distinct, not necessarily mutually compatible approaches. For example, not all frameworks of granular computing used in general rough sets are compatible with those used in fuzzy sets.

A natural question is: do granulations come first, or do granules? If the goal is to perceive and classify objects irrespective of ontology or associated process, then the question is not particularly relevant. Some approaches to granularity, as in the classical granular computing paradigm (CGCP), prefer to start from granulations and proceed to consider granules at multiple levels of precision. In the axiomatic approach (AGCP) [1], especially when ontology is important, it is more common to proceed from granules to granulations. But the converse approach is also relevant in AGCP. In adaptive systems, when granules are permitted to change relative to events or time or temporal instants, it makes sense to keep track of the changes through additional operators. This suggests that a bottom-up approach would be optimal in such scenarios.

The number of distinct approaches to ideas of granularity depends on the perspective used. The level or qualitative description of granules involved may also be a key determiner of the perspective used. The major approaches are CGCP, AGCP, the primitive granular computing paradigm (PGCP) and adaptive variants of the first two. Hierarchies within each of these types can also be formalized or specified.

1.1 Background

The concept of information can also be defined in many not necessarily equivalent ways. In the present author's view, anything that alters or has the potential to alter a given context in a significant positive way is information. In the contexts of general rough sets, the concept of information must have the potential to alter supervenience relations in the contexts (a set of properties Q supervenes on another set of properties T if there exist no two objects that differ on Q without differing on T), be formalizable, and be able to generate concepts of roughly similar collections of properties or objects. One of the popular abstractions is that of an information table.

Formally, an information table \(\mathcal {I}\) is a tuple of the form

\(\mathcal {I} \,=\, \left\langle \mathfrak {O},\, \mathbb {A},\, \{V_{a} :\, a\in \mathbb {A}\},\, \{f_{a} :\, a\in \mathbb {A}\} \right\rangle \)

with \(\mathfrak {O}\), \(\mathbb {A}\) and \(V_{a}\) being respectively sets of objects, attributes and values, and \(f_a :\, \mathfrak {O} \longmapsto \wp (V_a)\) being the valuation map associated with attribute \(a\in \mathbb {A}\). Values may also be denoted by the binary function \(\nu :\, \mathbb {A}\times \mathfrak {O} \longmapsto \wp (V)\) (with \(V = \bigcup V_a\)) defined by \(\nu (a, x) = f_a (x)\) for any \(a\in \mathbb {A}\) and \(x\in \mathfrak {O}\).

Relations may be derived from information tables by way of conditions of the following form: for \(x, w \in \mathfrak {O}\) and \(B\,\subseteq \, \mathbb {A} \), \((x,\,w)\,\in \, \sigma \) if and only if \((\mathbf {Q} a, b\in B)\, \Phi (\nu (a,\,x),\, \nu (b,\, w)) \) for some quantifier \(\mathbf {Q}\) and formula \(\Phi \). The relational system \(S = \left\langle \underline{S}, \sigma \right\rangle \) (with \(\underline{S} = \mathfrak {O}\)) is said to be a general approximation space (S and \(\underline{S}\) will be used interchangeably). In particular, if \(\sigma \) is an equivalence relation, then S is referred to as an approximation space. It should be noted that objects are assumed to be defined (to the extent possible) by attributes and associated valuations.
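As an illustration of this scheme, the following Python sketch (the toy table, the attribute names, and the choices \(\mathbf {Q} = \forall \) and \(\Phi \) = equality of values are all assumptions of the sketch) derives an indiscernibility relation from an information table.

```python
# Deriving a relation sigma from a toy information table, with Q = "for all"
# over B and Phi = equality of values; table and names are hypothetical.
from itertools import product

def indiscernibility(objects, B, value):
    # value(a, x) plays the role of nu(a, x) = f_a(x)
    return {(x, w) for x, w in product(objects, repeat=2)
            if all(value(a, x) == value(a, w) for a in B)}

table = {("colour", 1): "red", ("colour", 2): "red", ("colour", 3): "blue",
         ("size", 1): "big", ("size", 2): "big", ("size", 3): "small"}
sigma = indiscernibility([1, 2, 3], ["colour", "size"],
                         lambda a, x: table[(a, x)])
print(sigma)  # an equivalence relation here: 1 and 2 are indiscernible
```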

In classical rough sets, the lower and upper approximation operators on the power set \(\wp (S)\), apart from the usual Boolean operations, are defined for a subset \(A\in \wp (S)\) as per: \(A^l = \bigcup _{[x]\subseteq A} [x]\), \(A^{u} = \bigcup _{[x]\cap A\ne \varnothing } [x]\), with [x] being the equivalence class generated by \(x\in S\). If \(A, B\in \wp (S)\), then A is said to be roughly included in B \((A\sqsubseteq B)\) if and only if \(A^l \subseteq B^l\) and \(A^u\subseteq B^u\). A is roughly equal to B (\(A\approx B\)) if and only if \(A\sqsubseteq B\) and \(B\sqsubseteq A\). The positive, negative and boundary regions determined by a subset A are respectively \(A^l\), \({(A^{u})^c}\) and \(A^{u}\setminus A^l\) (c being the set complement).
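The operators translate directly into code. A minimal sketch follows, assuming the partition of S into equivalence classes is given; the partition and the set A are illustrative.

```python
# The classical operators, assuming `classes` is the partition of S into
# equivalence classes; the partition and the set A are illustrative.
def lower(A, classes):
    # A^l: union of the classes wholly contained in A
    return set().union(*(c for c in classes if c <= A))

def upper(A, classes):
    # A^u: union of the classes that meet A
    return set().union(*(c for c in classes if c & A))

classes = [{1, 2}, {3}]
A = {1, 3}
print(lower(A, classes), upper(A, classes))  # {3} {1, 2, 3}
```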

In a general approximation space \(S = \left\langle \underline{S}, R\right\rangle \), any subset \(A\subseteq S\) will be said to be an R-block if and only if it is maximal with respect to the property \(A^2 \subseteq \, R\). The set of all R-blocks of S will be denoted by \(\mathcal {B}_R (S)\). If R is reflexive, then \(\mathcal {B}_R (S)\) is a proper cover of S. These are examples of granules. Any map \(n: H \longmapsto \wp (H)\) on a set H generates a set of granules called neighborhood granules [2] on H; such maps are called neighborhood maps if \(x\in n(x)\) holds for all x. Specifically, the successor neighborhood generated by a point \(x\in S\) is \([x] = \{a:\, Rax\}\) (Rax in infix form is aRx).
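A minimal sketch of successor neighborhoods as granules follows; the relation R is an arbitrary illustrative choice, encoded as pairs (a, x) read "a R x".

```python
# Successor neighborhoods as granules; R is an arbitrary illustrative
# relation given as pairs (a, x) read "a R x".
def successor_neighborhood(x, R, universe):
    # [x] = {a : R a x}
    return {a for a in universe if (a, x) in R}

U = {1, 2, 3}
R = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 3)}  # reflexive, not transitive
print([successor_neighborhood(x, R, U) for x in sorted(U)])
# [{1}, {1, 2}, {2, 3}]
```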

In any formal approach to vagueness, it is necessary to specify the environment or context of discourse, the main objects of interest, presumptions about how objects interact with the environment, and interpretation. Often, people working in AI and ML refer to meta levels to partly specify this relative to what is known or assumed in the literature. This relative specification may not always be adequate (and requires elaboration) in a number of problems, as indicated in [1, 3]. Specific classes of domains that require different formalisms are considered in [4, 5].

In the context of general rough sets, various concepts of rough objects (including roughly equivalent objects) [1, 3, 6] with associated meta operations and rules correspond to semantic domains (or domains of discourse). In the context of relation based rough sets, the power set \(\wp (S)\) (or generalizations thereof), lower and upper approximation operators, and other usual operations, generate a semantics. The associated semantic domain in the sense of a collection of restrictions on possible objects, predicates, constants, functions and low level operations on those is referred to as the classical semantic domain (meta-C) for general rough sets [3]. In contrast, the semantic domain associated with sets of rough objects is a rough semantic domain (meta-R). Many other domains, including hybrid semantic domains, can be generated [1]. In [7], the models refer to reasoning about the power set of the set of possible order-compatible partitions of the set of rough objects in the context, while in [8], the models refer to maximal sequences of mutually distinguishable objects.

The concept of contamination was introduced in [9] and explored in [1, 3, 8] by the present author. It is always relative to the application context, and can be read as a realization of the meta principle that models should avoid making assumptions or simplifications that are not actualized in the application context, in contexts of human reasoning (or reasoning that involves causality as in human reasoning). A model is contaminated if and only if it does not satisfy the principle. Because of its focus on human reasoning (or reasoning that involves causality as in human reasoning), the problem of avoiding contamination may not always be important, or may be solved in much weaker senses, in specific application contexts of rough sets. For example, while computing attribute reducts of high dimensional noisy data, it may be more relevant to focus on quality of classification (especially when few preferences among attributes can be indicated or derived). On the other hand, while approximately designing the most tasty food for tigers under resource constraints, the addition of sodium glutamate and pepper to red meat (based on the experiences of non-vegetarian humans, who possess a far more sophisticated sense of taste) is not a good idea – in this scenario the approximations of tasty food are contaminated. Contamination may also be due to operations used in constructing approximations and rough objects [10].

Contamination avoidance is associated with a distinct minimalist approach that takes the semantic domains involved into account and has the potential to encompass the three principles of non-intrusive analysis. Some sources of contamination are assumptions about the distribution of attributes, the introduction of assumptions valid in one semantic domain into another by oversight [10], numeric functions used in rough sets (and soft computing in general), and fuzzy representations of linguistic hedges. Such avoidance is also essential for modeling relations between attributes [1, 6, 11, 12]. A Bayesian approach to modeling causality between attributes is proposed in [13] – the approach tries to avoid contamination to an extent.

For basics of partial algebras, see [14]. A partial algebra P is a tuple of the form \(\left\langle \underline{P},\,f_{1},\,f_{2},\,\ldots ,\, f_{n}, (r_{1},\,\ldots ,\,r_{n} )\right\rangle \) with \(\underline{P}\) being a set, \(f_{i}\)’s being partial function symbols of arity \(r_{i}\). The interpretation of \(f_{i}\) on the set \(\underline{P}\) should be denoted by \(f_{i}^{\underline{P}}\), but the superscript will be dropped in this paper as the application contexts are simple enough. If predicate symbols enter into the signature, then P is termed a partial algebraic system.

Terms are defined in the following way:

  • All variable symbols are term symbols;

  • If \(t_1, \ldots , t_{{r_i}}\) are term symbols, then \(f_i(t_1, \ldots , t_{{r_i}})\) is also a term symbol;

  • Nothing else is a term symbol.

When a term symbol t is interpreted on the partial algebra, then it is formally denoted by \(t^{\underline{P}}\) and referred to as a term. The distinction between the two will be left to the context in this paper.

For two terms \(s,\,t\), \(s\,{\mathop {=}\limits ^{\omega }}\,t\) shall mean: if both sides are defined, then the two terms are equal (the quantification is implicit). \({\mathop {=}\limits ^{\omega }}\) is the same as the existence equality (also written as \({\mathop {=}\limits ^{e}}\)) in the present paper. \(s\,{\mathop {=}\limits ^{\omega ^*}}\,t\) shall mean: if either side is defined, then the other is, and the two sides are equal (the quantification is implicit). Note that the latter equality can be defined in terms of the former as \(s\,{\mathop {=}\limits ^{\omega ^*}}\,t \,\leftrightarrow \, (s\,{\mathop {=}\limits ^{\omega }}\,t \,\&\, (s\,{\mathop {=}\limits ^{\omega }}\,s \leftrightarrow t\,{\mathop {=}\limits ^{\omega }}\,t))\).
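This pair of equalities can be illustrated computationally. The following minimal Python sketch (the None-based encoding of undefinedness is an assumption of the sketch, not a convention from the literature) implements both notions and the identity just stated.

```python
# Weak (omega) vs strong (omega*) equality of partial terms; undefined
# values are modeled by None (an encoding choice of this sketch).
def weak_equal(s, t):
    # s =w t: if both sides are defined, they are equal
    return s is None or t is None or s == t

def strong_equal(s, t):
    # s =w* t: both sides have the same definedness, and weak equality holds
    return (s is None) == (t is None) and weak_equal(s, t)

print(weak_equal(None, 5), strong_equal(None, 5), strong_equal(None, None))
# True False True
```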

2 Mereology

Mereology is a collective term for a number of philosophical and formal theoretical approaches to parts and wholes, connectedness of objects, and variants thereof. Many of these approaches are not mutually compatible and so the discipline should be regarded as a plural one that is united by the goal to study parts and wholes [15, 16].

Five distinct phases in the development of mereology (based on significant methodological differences) are ancient, medieval, universal parthood related, early twentieth century and modern mereologies. The subject of mereology is common to most ancient cultures, and the associated philosophical debates concern questions related to the universality of parthood, the whole being a sum or fusion of its parts, and concepts of emptiness. Many of these debates have had significant impact on subsequent developments. Gradation of wholes into strong, weak and weaker wholes, for example, can be related to debates about no component (like wheels, poles and axle) of a chariot having the property of being a chariot. A whole in which the parts exist relative to the whole and are mutually dependent on the same is said to be strong, while a weak whole is one in which parts are less united. The concept of emptiness or the empty is complicated in most mereologies and is of ancient origin.

Some important principles that may be accepted in a specific theory are the existence of mereological atoms (entities with no proper parts), atomistic compositionality (everything is ultimately composed of atoms), extensionality (no two composite wholes can have the same proper parts), and the principle of unrestricted composition (any group of objects composes a whole).

A major difference between mereology and set theory is that the latter is committed to the existence of abstract entities such as empty sets and classes. In the former, the whole can be as concrete as the part is. The idea of an empty set is inadmissible in Lesniewski's mereology, and ideally the subject should be studied over categories or in a formal language. In most of this tutorial, parthood will be explored over a set-theoretic framework with its associated dualism. While the sum of certain things is unique whenever it exists, at least three concepts of mereological fusion are known. The third definition of fusion is that a fusion of bs is a sum of at least some bs. Thus a fusion of tomatoes may be the sum of all bright red ovaloid tomatoes. Variants of the third definition are used in this exposition. The fusion axiom is the principle that fusion is unrestricted, that is, that every plurality of objects has at least one fusion; this is not assumed here.

For ground mereology, in a first order language enhanced with quantifiers, the binary parthood predicate \(\mathbf {P}\) is assumed to be reflexive, antisymmetric and transitive. Theories that start from this mereology almost always assume a lot more. In the axiomatic approach to granules, transitivity is not always assumed, so the associated mereology is quite distinct. From a basic parthood predicate \(\mathbf {P}\) (irrespective of assumptions), the following derived predicates and partial operations \(\oplus , \,\odot ,\, \ominus \) can be defined (some conditions are omitted below; a computational sketch follows the list):

  • Overlap: \(\mathbf {O}xa\,\leftrightarrow \,(\exists z)\,\mathbf {P}zx\,\wedge \,\mathbf {P}za\)

  • Proper Part: \(\mathbb {P}xa\, \leftrightarrow \, \mathbf {P}xa\wedge \lnot \mathbf {P}ax\),

  • Overcross: \(\mathbb {X}xa \,\leftrightarrow \,\mathbf {O}xa\wedge \lnot \mathbf {P}xa\),

  • Proper Overlap: \(\mathbb {O}xa \,\leftrightarrow \,\mathbb {X}xa \,\wedge \,\mathbb {X}ax \),

  • wDifference1: \( (\forall x,a, z)(x \ominus a= z \,\rightarrow \,(\forall w)(\mathbf {P}wz\,\leftrightarrow \,(\mathbf {P}wx \wedge \lnot \mathbf {O}wa))) \)

  • Sum1: \((\forall x,y, z)(x \oplus y= z \, \rightarrow \,(\forall w)(\mathbf {O}wz\, \leftrightarrow \,(\mathbf {O}wx \vee \mathbf {O}wy))) \)

  • Product1: \((\forall x,y, z)(x \odot y= z \,\rightarrow \,(\forall w)(\mathbf {P}wz\,\leftrightarrow \,(\mathbf {P}wx \wedge \mathbf {P}wy))) \)
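The following Python sketch (all names and the pair-set encoding of \(\mathbf {P}\) are illustrative assumptions) computes the derived predicates over a finite parthood relation.

```python
# Derived mereological predicates over a finite parthood relation P,
# encoded as a set of pairs (x, a) read "x is part of a". Illustrative only.
def O(x, a, P, U):   # Overlap: some z is part of both x and a
    return any((z, x) in P and (z, a) in P for z in U)

def PP(x, a, P):     # Proper part
    return (x, a) in P and (a, x) not in P

def X(x, a, P, U):   # Overcross: overlaps a but is not part of it
    return O(x, a, P, U) and (x, a) not in P

def PO(x, a, P, U):  # Proper overlap: both overcross each other
    return X(x, a, P, U) and X(a, x, P, U)

U = {"w", "x", "a"}
P = {("w", "w"), ("x", "x"), ("a", "a"), ("w", "x"), ("w", "a")}  # reflexive
print(O("x", "a", P, U), PP("w", "x", P), PO("x", "a", P, U))  # True True True
```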

3 General Rough Sets and Granularity

General rough sets can be studied for different purposes from the perspective of AGCP, CGCP or non-granular perspectives and in many different ways. Ideas of granularity used in fuzzy sets (see [17]) in particular are not always compatible with those used in general rough sets. It can, however, be said that granules (or information granules) are basically collections sharing some properties relating to indiscernibility, similarity or functionality at some levels of discourse.

3.1 Granules and Granulations

A granule may be vaguely defined as some concrete or abstract realization of relatively simpler objects through the use of which more complex problems may be solved. Granules exist relative to the problem being solved. In the present author's view, at least some of the basic ideas of granular computing have been in use since the dawn of human evolution. In earlier papers [1, 3, 18], she has shown that the methods can be classified into PGCP, CGCP and AGCP. Adding adaptive aspects and other time related constraints (especially for handling interactive or emergent systems [19, 20]) leads to additional categories; because these have been considered from the perspective of CGCP, they may be regarded as extensions of it. In all theories or theoretical understandings of granularity, the term granules refers to parts or building blocks of the computational process, and granulations to collections of such granules in the context.

3.2 Primitive Granular Computing

Even in the available information on the earliest human habitations and dwellings, it is possible to identify a primitive granular computing process at work. This can, for example, be seen in the stone houses, dating to 3500 BCE, used in what is present-day Scotland. Related details can be found in [1].

The main features of primitive granular computing are that

  • requirements associated with the problem are not rigidly specified;

  • both vague and precise granules (more often the former) may be used;

  • not much formalization is involved in the specifications (historically, these became more complicated in mereological approaches), and formalization has never been part of the goals;

  • scope for abstraction is relatively limited and

  • the concept of granules used may be concrete or abstract (relative to all materialist and extended materialist viewpoints), but may be barely constrained by rules.

While the method may be of ancient origin, it is still used in a number of modern contexts. The diet of people living in regions close to the sea depends on seasonal fluctuations in the production of fish and other foods. These dynamics can be understood in the perspective of PGCP [1].

3.3 Classical Granular Computing Paradigm

In the context of commercial painting, different parts of navigation indicators can be painted with brushes of different sizes. The artist involved may be able to use many distinct subsets of brushes to paint the sign, based on choice of style, the time required to complete the sign, and quality. The entire thinking process associated with the execution of the job can be viewed from a granular computing paradigm based on approximate precision as opposed to exact precision (see [1]). One possible granular strategy in the situation is the following:

  • draw outline of sign using stencils;

  • identify sub-regions from the finest to the broadest;

  • make an initial selection of brushes;

  • paint and check the progress (and quality) of work produced, and finally

  • stop or repeat steps using more appropriate brush sizes.

The strategy used in the example falls under classical granular computing because painting brushes have fixed sizes. It differs from PGCP in that the form of the sign was preconceived, and the tools, including brushes, do not have a role in determining the conception of the product.

Security personnel, while opening the gates of a building for incoming or outgoing vehicular traffic, proceed from a granular perspective of approximation of the size or width of the vehicle involved. Granules of varying precision may be used in the process, as opposed to the kind of precision supposed in the previous example. This also suggests that a different axiomatic framework is being employed in the rough computation. The extent to which the gates have already been opened at a particular instant also has a role in influencing subsequent moves. If switching between levels of granularity is done, then it can also be argued that the solution used falls under CGCP and not PGCP. Because adaptivity is understood from a higher order perspective and in relation to features falling outside precision, this may be read from such a viewpoint as well.

In [3], the precision based granular computing paradigm was traced to [21] and named the classical granular computing paradigm (CGCP) by the present author. More correctly, it is an ancient method that has been identified as such in [1] and elsewhere by her. CGCP is often referred to as the granular computing paradigm and has since been used in soft, fuzzy and rough set theories in different ways [22,23,24,25,26]. Some of the paradigm fragments involved in applying CGCP are:

  • PF-1: Granules can exist at different levels of precision.

  • PF-2: Among the many precision levels, a precision level at which the problem at hand is solvable should be selected.

  • PF-3: Granulations (granules at specific levels or processes) form a hierarchy (a later development).

  • PF-4: It is possible to easily switch between precision levels.

  • PF-5: The problem under investigation may be represented by the hierarchy of multiple levels of granulations.

CGCP is Ancient. Many approximation methods used in mathematical practice essentially use CGCP for solving problems. Examples range from algorithms for approximating \(\pi \) to methods for finding square roots of numbers. An ancient procedure for computing square roots is the Babylonian method. It is at least 2500 years old and is essentially the following (a runnable sketch appears after the listing):

Babylonian Method

  • Problem: To compute \(\sqrt{a} \), \(a\in \mathbb {R}_+\), to some desired level of accuracy (specified in relative or absolute terms).

  • Initialization: Select an arbitrary value \(a_0\) close to \(\sqrt{a}\).

  • Recursion step: \(a_{n+1}\, =\,0.5\, (a_n + \frac{a}{a_n})\) for \(n\in \mathbb {N}\).

  • Repeat the previous step.

  • Stop if the desired accuracy is attained.
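The following Python sketch implements the recursion; the relative-error stopping rule, the tolerance and the iteration cap are assumptions of the sketch rather than part of the ancient procedure.

```python
# The Babylonian recursion a_{n+1} = 0.5 (a_n + a / a_n); the relative-error
# stopping rule and tolerance below are assumptions of this sketch.
def babylonian_sqrt(a, a0, rel_tol=1e-12, max_iter=100):
    x = a0
    for _ in range(max_iter):
        nxt = 0.5 * (x + a / x)
        if abs(nxt - x) <= rel_tol * abs(nxt):  # desired accuracy attained
            return nxt
        x = nxt
    return x

print(babylonian_sqrt(2.0, 1.5))  # 1.4142135623730951
```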

The algorithm is quadratically convergent, and good initialization is necessary for fast convergence. In other words, some idea about possible approximate solutions is also essential. It is a special case of many other methods, including the Newton-Raphson method and the modern Householder's method. In fact, in mathematical contexts, it is possible to indicate concepts of precision in a number of ways:

  • Fixed values of initialization correspond to bounds on the precision of the solution at different cycles of computation.

  • If the precision of the solution desired is alone fixed, then wide variation in initialization would be admissible.

  • If the time required for computation is alone fixed or specified by an interval, then again wide variation in precision of initialization would be admissible.

This suggests the following problem: Can CGCP be classified or graded relative to possible ways in which the precision can be categorized?

3.4 Axiomatic Granular Computing Paradigm

The axiomatic approach to granularity essentially consists in investigations relating to axioms satisfied by granules, the very definitions of granules and associated frameworks. Emphasis on axiomatic properties of granules can be traced to the papers [7, 9, 27] from the year 2007 (that is, if some earlier cover-based constructions of approximations are overlooked). Neighborhoods had been investigated by a number of authors (see references in [3, 26, 28,29,30]) with emphasis on point-wise approximations. A systematic axiomatic approach to granules and granulations is due to the present author in [3, 9]. Relatively more specific versions of this approach have rich algebraic semantics associated with them. Parts of the axiomatic approach developed by the present author for general rough sets have been known in some implicit form, but these were not properly stressed because of the partial dominance of the point-wise approach.

The axiomatic approach to granularity initiated in [9] has been developed by the present author in the direction of contamination reduction in [1, 3, 8, 10, 12]. The concept of admissible granules, mentioned earlier, was arrived at in the latter papers. From the order-theoretic algebraic point of view, the deviation is in a new direction relative to the precision-based paradigm. The paradigm shift includes a new approach to measures.

In the present author's classification, a rough approximation operator may be granular (in the axiomatic sense), co-granular, pointwise, abstract or empirical [31]. Most of the point-wise approximations in cover or relation-based approaches are co-granular. In cover based rough sets, three kinds of approximations are mentioned in [28]. Of these, the subsystem based approximations fall under the axiomatic granular approach (they are granular rather than non-granular), because granulations in this approach are necessarily set-theoretically derived from covers, while the approximations remain simple unions of granules. By empirical approximations is meant a set of approximations that have been specified in a concrete empirical context. These may not necessarily be based on known processes or definite attributes. Examples of such approximations have been discussed by the present author in rough contexts in [3, 32].

4 High Granular Operator Spaces and Variants

Abstract frameworks for the axiomatic approach called rough Y-systems (RYS) were introduced and studied by the present author in [3] and other papers. Granular operator spaces (and variants), investigated by the present author in [1, 33, 34] in particular, are simplifications and higher order variants of RYS. They are meant for both abstract and concrete approximations that are granular in the sense of the axiomatic approach, and are well suited for investigating semantic questions, representation, ontology, formulation of semantics and the inverse problem. Other abstract approaches to rough sets without any restrictions on granularity, but with additional assumptions about order structure and negations as in [35], are less related. For the connection of the present approach to the numeric function based rough mereological approach [36], the reader may refer to [1, 3, 37].

In a high general granular operator space (GGS), defined below, aggregation and co-aggregation operations (\(\vee , \,\wedge \)) are conceptually separated from the binary parthood (\(\mathbf {P}\)) and a basic partial order relation (\(\le \)). Parthood is assumed to be reflexive and antisymmetric; it may satisfy additional generalized transitivity conditions in many contexts. Real-life information processing often involves many non-evaluated instances of aggregations (fusions), commonalities (conjunctions) and implications, because of laziness, supporting meta data, or other reasons – this justifies the use of partial operations. Specific versions of a GGS and granular operator spaces have been studied in [1] by the present author for handling a large spectrum of rough set contexts. A GGS has the ability to handle adaptive situations as in [38] through special morphisms – this is again harder to express without partial operations.

The underlying set \(\underline{\mathbb {S}}\) may be a set of collections of attributes, objects with or without labels, or anything else. In practice, the set of all attributes in a context need not be known exactly to the reasoning agent constructing the approximations. The element \(\top \) may be omitted in these situations, or the issue can be managed through restrictions on the granulation. Also, it often happens that certain objects cannot be approximated in an acceptable way. Therefore, it can be argued that the approximation operations used should be partial. Related abstractions (Pre-GGS) are not discussed in this tutorial.

Definition 1

A High General Granular Operator Space (GGS) \(\mathbb {S}\) shall be a partial algebraic system of the form \(\mathbb {S} \, =\, \left\langle \underline{\mathbb {S}}, \gamma , l , u, \mathbf {P}, \le , \vee , \wedge , \bot , \top \right\rangle \) with \(\underline{\mathbb {S}}\) being a set, \(\gamma \) being a unary predicate that determines an admissible granulation \(\mathcal {G}\) (defined below) for \(\mathbb {S}\) (by the condition \(\gamma x\) if and only if \(x\in \mathcal {G}\)), and l, u being operators \(:\underline{\mathbb {S}}\longmapsto \underline{\mathbb {S}}\) satisfying the following conditions (\(\underline{\mathbb {S}}\) is replaced with \(\mathbb {S}\) if clear from the context; \(\vee \) and \(\wedge \) are idempotent partial operations and \(\mathbf {P}\) is a binary predicate; further, \(\gamma x\) will be replaced by \(x \in \mathcal {G}\) for convenience):

(Display of defining axioms: figure a. In particular, \(\mathbf {P}\) is reflexive and antisymmetric.)

Let \(\mathbb {P}\) stand for proper parthood, defined via \(\mathbb {P}ab\) if and only if \( \mathbf {P}ab \, \& \,\lnot \mathbf {P}ba\). A granulation is said to be admissible if there exists a term operation t formed from the weak lattice operations such that the following three conditions hold:

\((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; t(y_1,\, y_2, \ldots ,\, y_r) = x^l\) and \((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; t(y_1,\, y_2, \ldots ,\, y_r) = x^u\) (Weak RA, WRA),

\((\forall y \in \mathcal {G})(\forall x \in \underline{\mathbb {S}})\,(\mathbf {P}yx \longrightarrow \mathbf {P}yx^{l})\) (Lower Stability, LS),

\((\forall y,\, z \in \mathcal {G})(\exists x \in \underline{\mathbb {S}})\,(\mathbf {P}yx \,\&\, \mathbf {P}zx \,\&\, x^{l} = x^{u} = x)\) (Full Underlap, FU).

The conditions defining admissible granulations mean that every approximation is representable by granules in an algebraic way, that every granule coincides with its lower approximation (granules are lower definite), and that all pairs of distinct granules are parts of definite objects (those that coincide with their own lower and upper approximations). Special cases of the above are defined next.
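For the classical case, the admissibility conditions can be checked mechanically. The sketch below (illustrative, not a general decision procedure) restates the classical operators, takes the term operation t to be set union, and tests lower definiteness and full underlap.

```python
# Admissibility checks for the classical granulation: granules are the
# equivalence classes and the term operation t is set union.
def lower(A, classes):
    return set().union(*(c for c in classes if c <= A))

def upper(A, classes):
    return set().union(*(c for c in classes if c & A))

def lower_definite(granules, classes):
    # every granule coincides with its lower approximation
    return all(lower(g, classes) == g for g in granules)

def full_underlap(granules, classes, universe):
    # any two granules are parts of some definite object
    def definite(x):
        return lower(x, classes) == x == upper(x, classes)
    return all(any(g <= x and h <= x and definite(x)
                   for x in (g | h, universe))
               for g in granules for h in granules)

classes = [{1, 2}, {3}]
print(lower_definite(classes, classes),
      full_underlap(classes, classes, {1, 2, 3}))  # True True
```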

Definition 2

  • In a GGS, if the parthood is defined by \(\mathbf {P}ab\) if and only if \(a \le b\), then the GGS is said to be a high granular operator space (GS).

  • A higher granular operator space (HGOS) \(\mathbb {S}\) is a GS in which the lattice operations are total.

  • In a higher granular operator space, if the lattice operations are set theoretic union and intersection, then the HGOS will be said to be a set HGOS.

In [39], it is shown that the binary predicates can be replaced by partial two-place operations and \(\gamma \) by a total unary operation. This results in a semantically equivalent partial algebra called a high granular operator partial algebra (GGSp).

Example 1

Suppose the problem at hand is to represent the knowledge of a specialist in automobile engineering and production lines in relation to a database of cars, car parts, calibrated motion videos of cars and performance statistics. The database is known to include a number of experimental car models, and some sets of cars have model names, engines or other crucial characteristics associated. Let \(\underline{\mathbb {S}}\) consist of the cars, some subsets of cars, and sets of internal parts and components of many cars, and let \(\mathcal {G}\) be the set of internal parts and components of many cars. Further, let

  • \(\mathbf {P}a b\) express the relation that a is a possible component of b, or that a belongs to the set of cars indicated by b,

  • \(a \le b\) indicate that b is a better car than a relative to a certain fixed set of features,

  • \(a^l\) indicate the closest standard car model whose features are all included in a, or the set of components that are included in a,

  • \(a^u \) indicate the closest standard car model whose features include all those of a, or the fusion of the set of components that include a, and

  • \(\vee \), \(\wedge \) can be defined as partial operations, while \(\bot \) and \(\top \) can be specified in terms of attributes.

Under these conditions, \(\mathbb {S} \, =\, \left\langle \underline{\mathbb {S}}, \mathcal {G}, l , u, \mathbf {P}, \le , \vee , \wedge , \bot , \top \right\rangle \) forms a GGS. If the specialist has updated her knowledge over time, then this transformation can be expressed with the help of morphisms from a GGS to itself.

Granular operator spaces and variants (specifically high granular operator spaces) adhere to the weak definitions of granularity of the axiomatic granular approach, do not assume a negation operation, allow the universe to be a collection of rough objects (in some sense), a mix of rough and non-rough objects, or even a collection of all objects involved, assume that the sense of parthood between objects is distinct from other order relations, permit realistic partial aggregation and commonality operations, and do not assume simplified numeric measures in general. These features are motivated by properties satisfied by models in real reasoning contexts, and help in avoiding contamination to a substantial extent.

4.1 Granularity Axioms

Even when additional lower and upper approximation operators are added to a GGS, the resulting framework will still be referred to as a GGS. In such a framework, granules definitely satisfy some of the axioms in the following lists (which are not assumed to be exhaustive). It is assumed that a finite number of lower (\(\{l_i \}_{i=1}^n\)) and upper (\(\{u_i \}_{i=1}^n\)) approximation operators are used. The axioms have been grouped based on their role relative to approximations and ontology, and are known to have a central role in defining possible concepts of granules. For readability, the interpretations of the predicate \(\gamma \) are written out explicitly.

Representation Related Axioms. The central idea expressed by these axioms is that approximations are formed from granules through set theoretic or more general operations on granules that may be derived from the parthood relation used. In classical rough sets, every approximation is a union of equivalence classes (the granules). If \(+\) is an aggregation operation (possibly related to the parthood used), the representability axioms read: \((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; x^{l_i} = y_1 + y_2 + \cdots + y_r\) and \((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; x^{u_i} = y_1 + y_2 + \cdots + y_r\) (RA).

In the weaker versions below, approximations are assumed to be representable by derived terms instead of through direct aggregation of granules: \((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; t(y_1, \ldots , y_r) = x^{l_i}\) and \((\forall x)(\exists y_1, \ldots , y_r \in \mathcal {G})\; t(y_1, \ldots , y_r) = x^{u_i}\) (Weak RA, WRA).

The prefix sub, as in Sub RA, is used to indicate situations where only a subset of the approximations happens to be representable.

Crispness Axioms. As indicated before, an object is crisp in a sense if it is its own approximation in that sense. This is quite different from claiming that crisp objects are those that cannot be approximated by any other object. While crispness of granules is not a given, they may possibly satisfy crispness axioms such as absolute crispness: \((\forall y \in \mathcal {G})\; y^{l_i} = y^{u_i} = y\) for each i (ACG).

Crispness Variants: By analogy, the crispness variants sub crispness (SCG), lower absolute crispness (LACG), upper absolute crispness (UACG), lower sub crispness (LSCG), and upper sub crispness (USCG) can be defined as for the representation related axioms.

Mereological Axioms. The axioms for mereological properties of granules are presented next. The axiom of mereological atomicity (MER) says that no definite elements (relative to any permitted pair of lower and upper approximations) can be proper parts of granules.

The axiom of sub-mereological atomicity says that no definite elements (relative to at least one specific pair of lower and upper approximations) can be proper parts of granules, while the axiom of inward-mereological atomicity says that no definite elements (relative to every permitted pair of lower and upper approximations) can be proper parts of granules.

Stability Axioms. Stability of granules means that granules should preserve appropriate parthood relations relative to approximations. Lower stability, defined below, says that if a granule is part of an object, then the granule should also be part of the lower approximation of that object (in general, the same does not hold for arbitrary objects). Some stability axioms are

\((\forall y \in \mathcal {G})(\forall x \in \underline{\mathbb {S}})\,(\mathbf {P}yx \longrightarrow \mathbf {P}y x^{l_i})\) (LS\(_i\), lower stability); the corresponding upper stability axioms US\(_i\) require granules that overlap an object to be parts of its upper approximations, and ST\(_i\) abbreviates the conjunction of LS\(_i\) and US\(_i\).

Overlap Axioms. The possible implications of the mereological overlap and underlap relations between granules are captured by these axioms. Some of these are

\((\forall y,\, z \in \mathcal {G})\,(\lnot \mathbf {O}yz \vee y = z)\) (no overlap, NO) and \((\forall y,\, z \in \mathcal {G})(\exists x \in \underline{\mathbb {S}})\,(\mathbf {P}yx \,\&\, \mathbf {P}zx \,\&\, x^{l_i} = x^{u_i} = x)\) (full underlap, FU); lower and upper variants such as LFU require definiteness relative to the lower or upper approximations alone.

Idempotence Axioms. Idempotence of approximation operators relative to granules is indicated by axioms such as \((\forall x)\; x^{l_i l_i} = x^{l_i}\) (LI\(_i\)) and \((\forall x)\; x^{u_i u_i} = x^{u_i}\) (UI\(_i\)).

The pre-similarity axiom (PS) concerns the relation of commonalities between granules and parthood. It is redundant for classical rough sets, where the granules are the equivalence classes.

Apparently, the three axioms WRA, LS, and FU hold in most of the known theories and with most choices of granules. This has been the main motivation for the definition of admissibility of a subset to be regarded as a granule in [3] and in the definition of the GGS.

4.2 Specific Cases

A few examples that partially justify the formalism of the axioms are presented next; more details can be found in [1, 3], and a small computational check follows the theorem. Let \(S = \left\langle \underline{S}, R\right\rangle \) be a general approximation space with granulation \(\mathcal {G}\) being the set of successor neighborhoods, and, for \(A \subseteq \underline{S}\), \(A^{l} = \bigcup \{[x] :\, [x]\subseteq A\}\) and \(A^{u} = \bigcup \{[x] :\, [x]\cap A \ne \varnothing \}\).

Theorem 1

  • If R is an equivalence, then all of RA, ACG, MER, AS, FU, NO, PS, I, ST hold, but UU does not hold in general.

  • If R is a partial equivalence relation (symmetric, transitive and partially reflexive relation), RA, MER, NO, UU, US hold, but ACG may not.

  • If R is a reflexive relation, then RA and LFU hold, but none of MER, ACG, LI, UI, NO, FU holds in general.
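The following sketch (universe and relation chosen arbitrarily) illustrates the third part of the theorem: over successor neighborhoods of a reflexive, non-transitive relation, approximations built as unions of granules satisfy RA by construction, while ACG fails.

```python
# Granular approximations over successor neighborhoods [x] = {a : a R x}.
U = {1, 2, 3}
R = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 3)}  # reflexive, not transitive
G = [{a for a in U if (a, x) in R} for x in U]  # granules: [{1}, {1,2}, {2,3}]

def l(A): return set().union(*(g for g in G if g <= A))  # union of granules
def u(A): return set().union(*(g for g in G if g & A))

# RA holds by construction (approximations are unions of granules), but
# upper crispness of granules fails, so ACG does not hold:
print([u(g) == g for g in G])  # [False, False, False]
```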

Let \(\left\langle S, (R_{i})_{i\in K} \right\rangle \) be a multiple approximation space [40]; then, apart from the strong lower, weak lower, strong upper and weak upper approximations discussed in that paper, a hierarchy of approximations can be defined and related properties can be studied [1].

In the perspective of the axiomatic approach, the next definition is natural:

Definition 3

A specific mathematical approach to relation-based rough sets is granular only if it can be rewritten in the form of a general granular operator space or a higher order granular operator space satisfying additional conditions.

Some representation theorems that connect GGS with general approximation spaces are known and more are of natural interest [1].

5 Knowledge Representation and Granularity

From a theory of knowledge and application perspective, it is of much interest to study definitions, representation, ontology and relative consistency of knowledge, among other things. Ontological correspondences between knowledge in different contexts, and problems of conflict representation and resolution, are also of interest. The framework of high granular operator spaces (and partial algebras) can represent knowledge in a far more substantial way than is afforded by non-granular extensions of the situation in classical rough sets, more so because it is easily extensible with ontology.

In classical rough sets, if \(S= \left\langle \underline{S}, R\right\rangle \) is an approximation space, then approximations of subsets of \(\underline{S}\) of the form \(A^{l}\) and \(A^{u}\) represent clear and definite concepts [41]. Further, every equivalence class, interpreted as a granule, is definite. R in this perspective encodes knowledge by way of the distribution of definite objects. If Q is another, stronger equivalence (\(Q\,\subseteq \,R\)) on \(\underline{S}\), then the state of the knowledge encoded by \(\left\langle \underline{S},\,Q \right\rangle \) is a refinement of that of \(S=\left\langle \underline{S},\,R\right\rangle \). Subsequent work on logics and semantics for comparing different types of knowledge and measures of relative consistency can be found in [42,43,44] and elsewhere.
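A toy computational reading of refinement, with hypothetical partitions, is sketched below: every set definite relative to the coarser equivalence remains definite relative to the finer one.

```python
# Refinement of knowledge: Q-classes refine R-classes, so R-definite
# sets (unions of R-classes) are also Q-definite. Partitions are toy data.
from itertools import chain, combinations

R_classes = [{1, 2, 3}, {4, 5}]
Q_classes = [{1, 2}, {3}, {4, 5}]  # Q is a stronger equivalence than R

def definite(classes):
    # the definite sets are exactly the unions of classes
    return {frozenset(set().union(*combo))
            for combo in chain.from_iterable(
                combinations(classes, r) for r in range(1, len(classes) + 1))}

print(definite(R_classes) <= definite(Q_classes))  # True: knowledge refined
```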

This knowledge interpretation has been extended in a natural granular way to general approximation spaces by the present author in [9, 11]. In [9], choice operations are used over granules in the context of tolerance spaces for the construction of definite objects that correspond to clear concepts or beliefs with ontology. In the proto-transitive rough sets considered in [11], the upper approximation of an object may be a proper part of the upper approximation of its upper approximation. This itself has an impact on the granular axioms satisfied.

In general, some axioms of interest are:

K1: All granules are atomic units of knowledge.

K2: Knowledge is characterized by granules.

K3: Maximal collections of granules subject to a concept of mutual independence are admissible concepts of knowledge.

K4: Parts common to subcollections of maximal collections of granules are also knowledge.

K5: Knowledge \(K_1\) is fully consistent with another knowledge \(K_2\) if and only if both generate the same granules.

K6: Knowledge \(K_1\) is fully inconsistent with another knowledge \(K_2\) if and only if no granule of one is included in a granule of the other.

K7: Some granules are atomic units of knowledge.

K8: Every atomic unit of knowledge is a granule.

K9: Some collections of granules form a consistent unit of knowledge.

These axioms are not necessarily true in every context and stand to benefit much from additional ontologies that can specify rules of combination. This in turn makes the different semantic models that generalize high granular operator spaces (and partial algebras) all the more relevant [1, 39, 45].