[sc34wg3] Subject Schizophrenia

Murray Altheim sc34wg3@isotopicmaps.org
Wed, 25 Aug 2004 12:12:09 +0100

Bernard Vatant wrote:
> The following message in SWAD-Europe forum might be of interest in this context
> http://lists.w3.org/Archives/Public/public-esw-thes/2004Aug/0048.html
> Quoting the author Stella Dextre Clarke:
> "Just a little health warning: concepts are not so cut and dried as you
> might expect. They are slippery customers, and it is often impractical
> to decide when a concept has been "changed" or when the thesaurus entry
> has changed without changing the concept."
> One could read here, I guess, a more general "proxy" instead of "thesaurus entry" ...
> In the current debate, folks seem to take for granted that subjects have some absolute
> existence and absolute properties that could be represented/proxified/reified/indicated
> (pick your choice, I don't mind) or otherwise "captured" in a topic map or any other kind
> of representation/proxification/reification/indication, but which somehow exist out there
> in the blue, independently of and beyond any
> representation/proxification/reification/indication or other kind of capture.


If I understand you correctly, you and I are in full agreement about the status of
the concept of the Cartesian "subject". I believe the philosophical community has
provided a number of different solutions to the problem: probably the earliest would
be C.S. Peirce's Firstness, Secondness and Thirdness; another, the whole school of
thought following the later Wittgenstein, which I suggested is currently being
promulgated by the likes of Brandom. I.e., if we stop *worrying* about whether
there is some platonic set of categories out there that we are trying to find
(e.g., Sowa believes we just need to work harder and we will), and consider
instead the creation of ontologies as acts of human communication, then the
problem doesn't exactly *go away*, but changes from trying to capture the "true"
essence of reality to simply people communicating ideas back and forth (or froth,
as I'd originally mistyped).

> This implicit assumption is at least questionable, as the above reminder points out. My
> personal opinion on that is strongly biased both by my mathematics and hard science
> background, where I've learnt that "you never know what you are talking about, nor if what
> you say is right or wrong", and by too many readings in so-called "oriental" philosophy,
> which basically say the same kind of thing, as far as I understood them. So I tend to
> think that
> 1. This assumption is completely wrong (although I am ready to agree with a consensus
> viewpoint that it is at least undecidable),
> 2. To enable any kind of language, knowledge and communication, we generally act "as if"
> the subjects exist,
> 3. Actually we always deal with representations or proxies, without being able to know if
> there is something to be represented or proxified, let alone whether this something has
> properties isomorphic to the ones we have given to the proxies,
> 4. Proxies can have a longer life than whatever they are supposed to proxify. Their
> changing subject, if any, is better defined by the way they are used than by anything
> else, and is notoriously subtly changing over time, as Stella very well describes in the
> quoted post.
> So I suggest to avoid any explicit reference to this questionable subject, and stick to
> the proxy level we are able to manage, by replacing:
> "Those two proxies proxify the same subject"
> by a more agnostic
> "Those two proxies are equivalent"
> (if you really believe in subjects, read "the subject is the same" if you like)

Another approach, in line with my suggestion above, would be something akin to:

   "I am making the statement that those two proxies proxify the same subject."

or perhaps

   "Please consider that I believe these two proxies refer to the same subject."

Which, as I believe you'd agree, is what happens *anyway*. We never know absolutely,
we just make statements as if we do.
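
The distinction can be made concrete. A minimal sketch, assuming nothing from
the TMDM/TMRM vocabularies (all class and property names below are my own
invention, purely for illustration): equivalence is not recorded as a bare fact
about two proxies, but as a statement attributed to whoever asserts it.

```python
from dataclasses import dataclass

# Hypothetical sketch: proxy equivalence modelled as an attributed
# assertion ("I am making the statement that...") rather than an
# absolute fact. None of these names come from any Topic Maps spec.

@dataclass(frozen=True)
class Proxy:
    identifier: str

@dataclass(frozen=True)
class EquivalenceAssertion:
    asserter: str   # who makes the claim; the claim never floats free
    left: Proxy
    right: Proxy

    def __str__(self) -> str:
        return (f"{self.asserter} asserts that {self.left.identifier} "
                f"and {self.right.identifier} proxify the same subject")

a = Proxy("proxy:paris-topic")
b = Proxy("proxy:ville-lumiere")
claim = EquivalenceAssertion("murray", a, b)
print(claim)
```

The point of the design is only that dropping the `asserter` field is what
turns an act of communication back into a claim about absolute existence.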

> The best we can achieve then is to define agreement on
> - Rules under which two proxies have to be considered as equivalent in a specific model :
> this is TMDM.
> - Language enabling the expression of such rules for any other model : this is TMRM.
> - Process that may/should/must be applied when two proxies are found to be equivalent
> under such rules : this is Processing Model, which might be or not in the scope of the
> standard (I tend to think it should not)
> Martin's remark tends also to make me think that two proxies can be found to be equivalent
> under certain facets, or at a certain level of granularity, or under specific rules, and
> distinct under other facets or at a finer level of granularity, or under other rules. Of
> course, this is questioning the very foundation of the TM fundamental objective : one
> proxy <=> one subject ...

Past communications regarding facets lead me to believe that the idea
was to create a way to establish subject-ivity not by declaration but
by an accumulation of known properties (facets), as we find in Faceted
Classification. I don't believe this questions the foundations at all,
as I consider FC not as a challenge to the concept of canonical subjects
but merely as a sort of query or categorization methodology. One of my
minor breakthroughs in developing TM-based ontologies was realizing that
while over the past few decades a great deal of work has gone into
creating computer-based ontology expression languages, and into specific
implementations (such as Cyc), almost nothing has been done to solve the
problem that large ontologies create, i.e., that they typically do not
provide categorization systems aimed at assisting in navigation. I
believe this is because people are expected to use the taxonomic
structures already there. My little breakthrough was to look to the
library community's Faceted Classification as a further layer of
metadata (if you like) on top of an existing ontology, primarily to
assist in navigation.

In short, I don't see facets as a problem but as a solution. One can
have and use TM-based subjects by declaring them (as in, making a
statement), or by discerning them by combinations of facets at whatever
level of granularity makes sense for the purpose.
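
To illustrate the second route, here is a small sketch of facet-based
discernment. The facet names and values are invented for the example; the
only point is that "same subject" falls out of which combination of facets
you choose to compare, so coarser and finer granularities give different
equivalences over the same proxies.

```python
# Hypothetical sketch: proxies judged equivalent when they agree on a
# chosen combination of facets. All facet names here are invented.

proxies = [
    {"id": "p1", "discipline": "chemistry", "language": "en", "era": "modern"},
    {"id": "p2", "discipline": "chemistry", "language": "fr", "era": "modern"},
    {"id": "p3", "discipline": "physics",   "language": "en", "era": "modern"},
]

def discern(proxies, facets):
    """Group proxy ids by the values of the selected facets."""
    groups = {}
    for p in proxies:
        key = tuple(p[f] for f in facets)
        groups.setdefault(key, []).append(p["id"])
    return groups

# Coarse granularity: under "era" alone, all three proxies coincide.
print(discern(proxies, ["era"]))
# Finer granularity: "discipline" separates p3 from p1 and p2.
print(discern(proxies, ["discipline"]))
```

Declaring a subject then corresponds to naming a group outright, while
discerning one corresponds to picking the facet combination that carves it
out at the granularity the purpose requires.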


Murray Altheim                    http://kmi.open.ac.uk/people/murray/
Knowledge Media Institute
The Open University, Milton Keynes, Bucks, MK7 6AA, UK               .

   "[The US Dept. of] Justice offered up to $67 million to ChoicePoint
    in a no-bid deal for computer profiles with private information
    on every citizen of half a dozen nations. While the September 11th
    highjackers came from Saudi Arabia, Egypt, Lebanon and the Arab
    Emirates, ChoicePoint's menu offered records on Venezuelans,
    Brazilians, Nicaraguans, Mexicans and Argentines. [...] What do
    these nations have in common besides a lack of involvement in the
    September 11th attacks? Coincidentally, each is in the throes of
    major electoral contests in which the leading candidates have the
    nerve to challenge the globalization demands of George W. Bush."
                                 -- Venezuela Floridated, Greg Palast