Which Reality?
Quote from Erwin on December 3, 2021, 6:37 pm

Quote from Paul Vernon on December 3, 2021, 10:40 am
Quote from dandl on December 3, 2021, 12:23 am
17 and 24 are not values of a type, they are entities in their own right with their own attributes and features.
Yes. That has been my point in the Which type? topic. 17 is a value, full stop. It does not "carry with it", even conceptually, some identification of the type to which it belongs.

Database Explorations contains a chapter in which this issue is touched on. The number 17 is labeled an 'individual' in that treatment, to be seen as something distinct from "values of types", which is what TTM refers to when it says that the value INT(17) is not the same value as, say, LONG(17). Types are defined by collecting 'individuals' into sets called, say, 'INT' and 'LONG'. INT(17) is the individual '17' as included in type INT; LONG(17) is the individual '17' as included in type LONG. The question then is whether there is added value in a language having operators that "recognize" whether or not two distinct values 'hold' the same 'individual'. And ***don't*** let yourself be blinded by the given example. There could also be ANGLE(17), TEMP(17), LENGTH(17), ... and the purpose of how TTM wants its languages to use the type system is to facilitate type safety checked by the compiler, meaning, essentially: no coercions.

I can see how the "distinction" between INT(17) and LONG(17) can be regarded/experienced/... as "less relevant" than a similar "distinction" between ANGLE(17) and TEMP(17) (because INT and LONG don't really add the same kind of semantic information that ANGLE and TEMP do, and it is precisely this "addition" of semantic information that makes coercion undesirable). So perhaps there's an "extending the type system to capture more meaning" trap here, and (equally perhaps) it's not even known in general whether this is something to be avoided at all cost or, on the contrary, something to be eagerly stepped into. (So perhaps the authors kept limping on both ideas simultaneously because neither of them could make his mind up either.)
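[Editor's note] To make the "no coercions" point concrete, here is a minimal Java sketch. It is my own illustration, not Tutorial D and not anything from DBE; the names NoCoercionDemo, Angle, Temp and warmerBy are invented stand-ins for ANGLE, TEMP and an operator defined only on temperatures. Two distinct types both "hold" the individual 17, the compiler refuses to treat one as the other, and a separate, explicit operator is needed if we want to "recognize" that the underlying individual is the same.

```java
// A minimal sketch, assuming nothing beyond standard Java 16+ (records).
// Angle and Temp both "hold" the individual 17, yet are distinct values of
// distinct types, and the compiler performs no coercion between them.
public final class NoCoercionDemo {

    record Angle(int degrees) {}   // stand-in for ANGLE(...)
    record Temp(int kelvin) {}     // stand-in for TEMP(...)

    // An operator defined only for temperatures.
    static Temp warmerBy(Temp t, int delta) {
        return new Temp(t.kelvin() + delta);
    }

    public static void main(String[] args) {
        Angle a = new Angle(17);   // ANGLE(17)
        Temp  t = new Temp(17);    // TEMP(17)

        System.out.println(warmerBy(t, 5));            // Temp[kelvin=22]
        // warmerBy(a, 5);                             // rejected at compile time: no coercion
        System.out.println(a.equals(t));               // false: distinct values...
        System.out.println(a.degrees() == t.kelvin()); // ...that happen to "hold" the same individual, 17
    }
}
```

The last line is the only place where "same individual" can be observed, and only because the program explicitly unwraps both values; nothing in the type system identifies Angle(17) with Temp(17).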
Quote from Paul Vernon on December 3, 2021, 10:22 pm

Quote from Erwin on December 3, 2021, 6:37 pm
Database Explorations contains a chapter in which this issue is touched on. The number 17 is labeled an 'individual' in that treatment, to be seen as something distinct from "values of types", which is what TTM refers to when it says that the value INT(17) is not the same value as, say, LONG(17).

OK, wow. I had not read that part of DE (page 35, or 53 of the PDF). My inadequate defence is that I only have the PDF version, which is maybe less conducive to full study (but better for searching...) than a paper copy. I guess I got to the Relation Types part of that chapter and let my interest wane. I apologise to the forum for that!

It is certainly pleasing to see the "scalar values shall ... carry with them, at least conceptually, some identification of the type to which they belong" wordage of RM PRE2 stated much more formally. After a quick read I can say that the "underlying model" is pretty much what I had in my head, and as such it does not change my position on the desirability of such a model. It might help with terminology, however. I guess I would say that we don't need types, we just need individuals (or atoms, or ur-elements, to use set theory terms).

I would also say that, for me, the individual 17 is not something that can be a temperature. A temperature individual would be something like 17 K (i.e. 17 kelvin), and a similar comment goes for length and angle.

I would also be tempted to ask: what on earth is a LONG? But unfortunately I'm not a layman and I do know to what you refer. Well, I'm guessing it is Java's:

The long data type is a 64-bit two's complement integer.

and not (cough) Oracle's:

You use the LONG datatype to store variable-length character strings. The LONG datatype is like the VARCHAR2 datatype, except that the maximum size of a LONG value is 32760 bytes.
Quote from Erwin on December 3, 2021, 11:49 pm

Quote from Paul Vernon on December 3, 2021, 10:22 pm
I guess I would say that we don't need types, we just need individuals (or atoms, or ur-elements, to use set theory terms).
I would also say that, for me, the individual 17 is not something that can be a temperature. A temperature individual would be something like 17 K (i.e. 17 kelvin), and a similar comment goes for length and angle.
I would also be tempted to ask: what on earth is a LONG? But unfortunately I'm not a layman and I do know to what you refer. Well, I'm guessing it is Java's:
The long data type is a 64-bit two's complement integer.
and not (cough) Oracle's:
You use the LONG datatype to store variable-length character strings. The LONG datatype is like the VARCHAR2 datatype, except that the maximum size of a LONG value is 32760 bytes.

Well, somehow my gut feeling says "we don't need types, we just need individuals" is what COBOL is based on (and, perhaps villainously, saying the same in Java would look like 'we don't need classes, we just need Object'). "There's only numbers and text" - that sort of stuff. Every modern language goes beyond that because there is ***value*** to be had in going there. It just seems like nobody knows how far beyond is "too far".

That the "individual 17 is not something that can be a temperature" is sort of what I already hinted at, by hinting that "making it a member of the set of possible temperatures" is in fact an act of adding the semantics "we're talking about temperatures here" to the "bare" individual. So in a sense you seem to agree, except perhaps on the wording needed to express it.

As for the definition of what a 'long' is in Java, I suppose you can recognize the dependence of that definition on physical encoding, whereas the more mathematical (and thus more appropriate) version would be that it is an integer number in the range [x, y) (sorry for not bothering to look up the exact numbers - those are not the point).
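[Editor's note] For completeness, the [x, y) Erwin leaves unspecified is, for Java's long, the range from -2^63 up to but not including 2^63. A quick check against the standard library constants (plain Java, nothing TTM-specific; the class name LongRange is mine):

```java
import java.math.BigInteger;

// A quick check that long's range is [-2^63, 2^63), i.e. the mathematical
// reading of "64-bit two's complement integer".
public final class LongRange {
    public static void main(String[] args) {
        BigInteger two63 = BigInteger.TWO.pow(63);   // 2^63, computed exactly
        System.out.println(BigInteger.valueOf(Long.MIN_VALUE)
                .equals(two63.negate()));                      // true: MIN_VALUE == -2^63
        System.out.println(BigInteger.valueOf(Long.MAX_VALUE)
                .equals(two63.subtract(BigInteger.ONE)));      // true: MAX_VALUE == 2^63 - 1
    }
}
```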
Quote from dandl on December 3, 2021, 11:59 pm

Quote from Paul Vernon on December 3, 2021, 10:40 am
Quote from dandl on December 3, 2021, 12:23 am
17 and 24 are not values of a type, they are entities in their own right with their own attributes and features.
Yes. That has been my point in the Which type? topic. 17 is a value, full stop. It does not "carry with it", even conceptually, some identification of the type to which it belongs.

Sorry, but not even close. In this thought experiment 17 is a label, and 'value' is one of the things it is associated with in our minds. It could be a GUID, or it could be a Sparse Data Representation (see e.g. https://en.wikipedia.org/wiki/Sparse_approximation). It carries nothing around with it; all its attributes are encoded as connections to other SDRs.

It is not the type of a value that is important, it is the name we associate with a value that gives its meaning. So, using ordered pairs (i.e. attributes, which I write as name:value), your examples above (well, some of them) become

Again no, not even close. These attributes should all be read as sentences, linking together multiple labels in a chain to form the overall concept.

17 >> age >> years >> range >> oldest >> person >> car driver >> not

Like I said: this is not computing as we know it, but it could be how brains work (higher animals as well as ours).

Well actually I would like a system that can (automatically) recognise that {"oldest age you can be and can't vote":17 years} and {"oldest age in years you can be and can't vote":17} are (to a greater or lesser extent) equivalent.

As it happens, that's exactly the thing that SDRs do well.

Which is another way of me saying that I don't agree with the position that attribute names are arbitrary placeholders and that all meaning should be deferred to predicates... but hey, that really is another topic.

As to the point that 17 is 10001 in binary and 11 in hex: yes, that is true. Still, I equate representation with value, so for me "10001 in binary" would be a value such as 10001b and "11 in hex" would be, say, x11. Then you could have records such as { "Equivalent Thing":10001b, "Thing":17 } , { "Equivalent Thing":x11 , "Thing":17 }. I.e. at best 10001b and 17 are equivalent (for some definition of equivalent). What they most certainly are not is the same. Not the same, not equal.

Connected as >> equivalent >>, not equal or identical.

Like 17, seventeen, Seventeen, dix-sept, XVII, MIG 17 all can be considered equivalent to some degree (well, I'd argue about MIG 17), but certainly none are the same value - if you equate representation with value, which I do and which I think is the (only) sensible way to go about things.

Which is what SDRs seem to do better than anything, hence my interest in them. In this approach value is just one of many labels.
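[Editor's note] A toy Java sketch of one possible reading of dandl's "labels connected to labels" idea. It is emphatically not an implementation of sparse representations or of how brains work; it only illustrates the claim that a label like 17 "carries nothing around with it" and that everything, including 'value' and 'equivalent', is just another connected label. The class and method names (LabelGraph, link, chain, associationsOf) are invented for the illustration.

```java
import java.util.*;

// A toy label graph: every concept is just a label (a String here), and
// everything it "carries" is a directed link to another label.
public final class LabelGraph {
    private final Map<String, List<String>> links = new LinkedHashMap<>();

    // Record a single directed association between two labels.
    void link(String from, String to) {
        links.computeIfAbsent(from, k -> new ArrayList<>()).add(to);
    }

    // Build a chain like: 17 >> age >> years >> range >> oldest >> person >> car driver >> not
    void chain(String... labels) {
        for (int i = 0; i + 1 < labels.length; i++) link(labels[i], labels[i + 1]);
    }

    List<String> associationsOf(String label) {
        return links.getOrDefault(label, List.of());
    }

    public static void main(String[] args) {
        LabelGraph g = new LabelGraph();
        g.chain("17", "age", "years", "range", "oldest", "person", "car driver", "not");
        g.chain("17", "value");               // 'value' is just one more label attached to 17
        g.chain("10001b", "equivalent", "17"); // equivalence is a connection, not identity

        System.out.println(g.associationsOf("17"));     // [age, value]
        System.out.println(g.associationsOf("10001b")); // [equivalent]
    }
}
```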
Quote from Dave Voorhis on December 4, 2021, 12:26 am

Quote from dandl on December 3, 2021, 11:59 pm
Quote from Paul Vernon on December 3, 2021, 10:40 am
Quote from dandl on December 3, 2021, 12:23 am
17 and 24 are not values of a type, they are entities in their own right with their own attributes and features.
Yes. That has been my point in the Which type? topic. 17 is a value, full stop. It does not "carry with it", even conceptually, some identification of the type to which it belongs.
Sorry, but not even close. In this thought experiment 17 is a label, and 'value' is one of the things it is associated with in our minds. It could be a GUID, or it could be a Sparse Data Representation (see e.g. https://en.wikipedia.org/wiki/Sparse_approximation). It carries nothing around with it; all its attributes are encoded as connections to other SDRs.
It is not the type of a value that is important, it is the name we associate with a value that gives its meaning. So, using ordered pairs (i.e. attributes, which I write as name:value), your examples above (well, some of them) become
Again no, not even close. These attributes should all be read as sentences, linking together multiple labels in a chain to form the overall concept.
17 >> age >> years >> range >> oldest >> person >> car driver >> not
Like I said: this is not computing as we know it, but it could be how brains work (higher animals as well as ours).

<aside>
That reminds me a bit of Cyc (see https://en.wikipedia.org/wiki/Cyc).
Some years ago, I used OpenCyc in a research project that for a couple of years was the basis for a public Web site's semantic search feature. Like much of Artificial Intelligence (I use the term loosely) R&D, on some things it was so good it was creepy; on other things, hopelessly bad.
</aside>
Quote from Paul Vernon on December 4, 2021, 12:20 pm

Quote from Erwin on December 3, 2021, 11:49 pm
Well, somehow my gut feeling says "we don't need types, we just need individuals" is what COBOL is based on (and, perhaps villainously, saying the same in Java would look like 'we don't need classes, we just need Object'). "There's only numbers and text" - that sort of stuff. Every modern language goes beyond that because there is ***value*** to be had in going there. It just seems like nobody knows how far beyond is "too far".

My first job after university was COBOL and non-relational database programming... maybe there is something in that.

Still, I certainly think there is huge value in going beyond "numbers and text". I wonder if it is not a question of "how far" to go but rather of "how" to go.

To take a quick example: if I have an AccountStatus attribute name, and I want to have, say, "open" and "closed" as the valid attribute values, how do I decide whether I need a new type and/or new values? I.e. can I just use the (already existing) string values "open" and "closed", or do I need to cut new values - open and closed, say - and (maybe) give them a type such as Status or AccountStatus (or is that "Status" or "AccountStatus"?)

Now I certainly say that the concept of an account status (the status of an account) is (some sort of) a thing. It might be one thing, or a "status" thing combined with an "account" thing, thereby allowing that "status" might be a thing that is validly combined with other things.

So, it is about how we capture the "thingness" of "account status". Is that via a new or existing type, via new or existing values? That is the hard bit to decide. "How", not "how far".

Quote from Erwin on December 3, 2021, 11:49 pm
That the "individual 17 is not something that can be a temperature" is sort of what I already hinted at, by hinting that "making it a member of the set of possible temperatures" is in fact an act of adding the semantics "we're talking about temperatures here" to the "bare" individual. So in a sense you seem to agree, except perhaps on the wording needed to express it.

Yes, maybe I would agree. Let me try this wording:

I would say that if you "add semantics to a 'bare' individual" you have a new individual. IOW you can't "update" or "add to" a value (only variables). If you really do want to "add to" an individual, you do that by collecting that individual and another individual into a set (or into an ordered pair, or possibly a triple, etc.).

I.e. (for me) individuals (aka atoms aka urelements) are axiomatically declared in the model. They "just exist". Now sure, outside of the model - in the implementation, say - you might store 17 and 17 K using the same sequence of bits, plus just some extra data to say "this memory/storage location holds a number" and "this location holds a temperature" (and temperature is defined, following SI, as a quantity of kelvin). Also outside of the model is the question of how to "construct" new individuals/atoms (if such a thing is actually needed). Such construction might be done within a very similar model - say, by using set values to construct new values that can then be "mapped" to new literals for use inside the model - but it would not be done within the model itself.

BTW I don't think this is all that much different from TTM, except that TTM talks more about user-defined scalar types being implemented outside of the model, rather than user-defined scalar values per se.
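[Editor's note] One (hypothetical, Java-flavoured) way the "existing string values versus new values with their own type" choice plays out; the names StatusDemo, AccountV1, AccountV2 and AccountStatus are mine, chosen only to contrast the two options Paul describes, and this is not intended as the answer to his question.

```java
// Option 1 reuses existing string values; option 2 cuts new values OPEN and
// CLOSED and gives them their own type, so the compiler captures the "thingness".
public final class StatusDemo {

    // Option 1: any string is accepted as a status.
    record AccountV1(String number, String status) {}

    // Option 2: a dedicated type whose only values are OPEN and CLOSED.
    enum AccountStatus { OPEN, CLOSED }
    record AccountV2(String number, AccountStatus status) {}

    public static void main(String[] args) {
        var a1 = new AccountV1("123", "opne");              // typo accepted silently
        var a2 = new AccountV2("123", AccountStatus.OPEN);  // only OPEN or CLOSED possible
        // new AccountV2("123", "open");                    // rejected at compile time

        System.out.println(a1);
        System.out.println(a2);
    }
}
```

The sketch only shows the enforcement difference; it deliberately says nothing about whether "account status" should be one thing or a "status" thing combined with an "account" thing, which is the harder part of Paul's question.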
Quote from Paul Vernon on December 4, 2021, 12:53 pm

Quote from Dave Voorhis on December 4, 2021, 12:26 am
<aside>
That reminds me a bit of Cyc (see https://en.wikipedia.org/wiki/Cyc).
Some years ago, I used OpenCyc in a research project that for a couple of years was the basis for a public Web site's semantic search feature. Like much of Artificial Intelligence (I use the term loosely) R&D, on some things it was so good it was creepy; on other things, hopelessly bad.
</aside>

Dave, I'm not sure how you got from "SDR" to Cyc, but that is a good link.

@David, do you have a link specific to Sparse Data Representation? The https://en.wikipedia.org/wiki/Sparse_approximation page did not do much for me at first glance.
Quote from Dave Voorhis on December 4, 2021, 1:03 pm

Quote from Paul Vernon on December 4, 2021, 12:53 pm
Quote from Dave Voorhis on December 4, 2021, 12:26 am
<aside>
That reminds me a bit of Cyc (see https://en.wikipedia.org/wiki/Cyc).
Some years ago, I used OpenCyc in a research project that for a couple of years was the basis for a public Web site's semantic search feature. Like much of Artificial Intelligence (I use the term loosely) R&D, on some things it was so good it was creepy; on other things, hopelessly bad.
</aside>
Dave, I'm not sure how you got from "SDR" to Cyc, but that is a good link.

It wasn't "SDR", but dandl's mention of concepts and speculation about how brains work, and this:

17 >> age >> years >> range >> oldest >> person >> car driver >> not

It all (distantly, perhaps) reminded me of the semantic links in Cyc.
Quote from Erwin on December 4, 2021, 9:39 pm

Quote from Paul Vernon on December 4, 2021, 12:20 pm
I would say that if you "add semantics to a 'bare' individual" you have a new individual.

I would say this is not respecting the distinction that DBE was trying to make between 'individuals' and 'values': any act of "adding semantics" (in the sense in which it arose in this discussion) is precisely what gets you from 'individual' to 'value', and therefore the result of that act can ***never*** be "a new individual" (nor should it be). Even if the semantics added are as "superficial" as "we're talking about just any number here" as opposed to "we're talking about squares of weights here".
Quote from Erwin on December 4, 2021, 10:26 pm

Quote from Paul Vernon on December 4, 2021, 12:20 pm
If I have an AccountStatus attribute name, and I want to have, say, "open" and "closed" as the valid attribute values, how do I decide whether I need a new type and/or new values? I.e. can I just use the (already existing) string values "open" and "closed", or do I need to cut new values - open and closed, say - and (maybe) give them a type such as Status or AccountStatus?

More research is needed. ( :-) )

Mathematically, any type consisting of two values is isomorphic to BOOLEAN, meaning you can always just use type BOOLEAN instead without loss of anything.

And then there's the fact that a relation schema that has a boolean attribute X is provably information-equivalent to a design with two relation schemas that both have all the attributes of the single relation schema except X (OPEN_ACCOUNTS and CLOSED_ACCOUNTS, say). (With the proviso that if X does not participate in all keys of its relation, then an empty-intersection constraint between the two alternative relvars must also be declared.)

And then there's the fact that "status" codes are also often used to [try and] capture "workflow progress" information, with status codes like "completeness verification pending", "completeness verified, acquisition proposal pending", "acquisition proposal finalised, director approval pending", etc. etc. I strongly feel that this is mis-design, because of a personal conviction I developed decades ago that "codes are poor, entities are rich". Under that dictum, "codes" are ***ALWAYS*** a synthetic way of expressing whether a certain kind of event has happened or not. (And it's the 'synthetic' portion that accounts for "codes" being "poor" in informational value: the "code" can ***never*** tell you things like "when did the event happen" (marriage, divorce, death, account closure, account opening, ...) or "who made the event happen" (who authored the acquisition proposal, who approved it, ...).) Thanks to Hugh and Chris, I now know what I really meant at the time was "codes are poor, ***relations*** are rich".

And ***especially*** in the particular application of "codes" to model "workflow progress": workflows are Petri nets, and Petri nets [used to represent workflows] are directed graphs, and the number of code values needed is equal to the number of possible sets of walked-through nodes in that graph. And if the workflow changes over time, this ***always*** impacts the set of possible code values, thus (in TTM terms) the type of the code changes (***) upon any change in workflow, and that's a thorny thing to deal with even in relational databases.

So I have felt for a very long time that replacing that one single "workflow progress status code" with as many relvars as there are steps in the workflow, each expressing "this step has been done (plus when and by whom)", is the way to go. But I didn't want to be declared even more nuts than people were already declaring me at the time, so I've always shut up about it - until now.

(***) This is (the same sloppy) shorthand (used by everyone else in the industry too) to express the fact that the ***type declaration*** of the attribute at hand must be changed to some other type that will allow us to express all possible "states" (= sets of possible walked-through nodes) of the ***new*** workflow (and we must ***also*** figure out the details (ALL the details) of the conversion procedure). Per TTM's axiom that types aren't variables (and therefore don't ever "change"), this is the only way of expressing it precisely, but, well, you see how long the sentence gets.
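[Editor's note] A toy, deliberately non-relational Java illustration of Erwin's boolean-attribute equivalence (the names SplitByBoolean, AccountA, OPEN_ACCOUNTS and CLOSED_ACCOUNTS are mine, not from any actual schema): a collection of rows keyed by account number with a boolean attribute round-trips losslessly into two collections without that attribute, provided the "empty intersection" constraint holds. It is a sketch of the information-equivalence argument only, not a claim about how a relational DBMS should implement it.

```java
import java.util.*;

// Design A: one "relation" with a boolean attribute.
// Design B: two "relations" without it, plus a disjointness constraint.
public final class SplitByBoolean {

    record AccountA(String number, boolean open) {}

    static final Set<String> OPEN_ACCOUNTS   = new HashSet<>();
    static final Set<String> CLOSED_ACCOUNTS = new HashSet<>();

    // Design A -> Design B.
    static void split(Collection<AccountA> accounts) {
        for (AccountA a : accounts)
            (a.open() ? OPEN_ACCOUNTS : CLOSED_ACCOUNTS).add(a.number());
    }

    // Design B -> Design A: the round trip loses no information.
    static List<AccountA> merge() {
        List<AccountA> out = new ArrayList<>();
        OPEN_ACCOUNTS.forEach(n -> out.add(new AccountA(n, true)));
        CLOSED_ACCOUNTS.forEach(n -> out.add(new AccountA(n, false)));
        return out;
    }

    // The "empty intersection" constraint, checked explicitly because the
    // account number (the key) says nothing about open/closed by itself.
    static boolean disjoint() {
        return Collections.disjoint(OPEN_ACCOUNTS, CLOSED_ACCOUNTS);
    }

    public static void main(String[] args) {
        split(List.of(new AccountA("123", true), new AccountA("456", false)));
        System.out.println(OPEN_ACCOUNTS + " " + CLOSED_ACCOUNTS); // [123] [456]
        System.out.println(disjoint());                            // true
        System.out.println(merge());                               // both rows recovered
    }
}
```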