Life after D with Safe Java
Quote from Erwin on April 17, 2021, 9:16 pm
Quote from Dave Voorhis on April 17, 2021, 7:47 pm
Quote from Erwin on April 17, 2021, 6:43 pm
Quote from Dave Voorhis on April 17, 2021, 6:19 pm
Quote from Erwin on April 17, 2021, 1:28 pm
Quote from Dave Voorhis on April 17, 2021, 9:55 am
Yes, annotations are mostly dire. They've sometimes been used to good effect -- e.g., to indicate special methods like "this is a unit test" without relying on language extension -- but all too often are used to create non-programmable, unreadable, unmaintainable, difficult-to-debug half-attempts at domain-specific-languages, typically when they're not required at all. A simple, straightforward library of classes would often have been better, but, alas, annotations are/were fashionable.
I'm seeing growing pushback against such annotation abuse, though, which is good.
Just out of curiosity. Could you comment on the following techniques for achieving the same thing:
Tech 1
abstract class Sup { abstract boolean isBar(); }
class Sub1 extends Sup { boolean isBar() { return true; } }
class Sub2 extends Sup { boolean isBar() { return false; } }
Tech 2
interface Bar {...}
class Sup { final boolean isBar() { return this instanceof Bar; } }
class Sub1 extends Sup implements Bar {...}
class Sub2 extends Sup {...}
Tech 3
@interface Bar {...}
class Sup { final boolean isBar() { return this.getClass().getAnnotation(Bar.class) != null; } }
@Bar
class Sub1 extends Sup {...}
class Sub2 extends Sup {...}
Tech 4
class Sup {...}
class Bar {...}
class Sub1 extends Sup, Bar {...} /* not valid java of course */
class Sub2 extends Sup {...}
Comment in particular on how to also keep using the same technique if there is a requirement to apply the technique for >1 characteristic. That is, what in technique 4 would yield something like
class Sub1 extends Sup, Bar, Foo, ... {...} /* still not valid java of course */
Sorry, I'm not clear what you're asking here. What is "achieving the same thing" meant to reference?
interface and @interface aren't the same thing (the former declares an interface, the latter declares an annotation) but I presume that's not what you meant.
Technique 1 answers a boolean question by obliging the developer to implement a method returning a constant (or if run-time state actually determines the answer, then the result of evaluating an expression, but in that case the other techniques aren't really available).
Technique 2 answers that same question (from a semantic point of view) by making the developer implement the appropriate interface that (in a manner of speaking) "makes it so" (with no further methods to implement).
Technique 3 answers that same question by making the developer provide the appropriate annotation that (in the same manner of speaking) "makes it so" (again with no further methods to implement).
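Worth noting in passing: a runnable version of Technique 3 needs one detail the elided example glosses over -- the annotation must be declared with RUNTIME retention, or getAnnotation() always returns null. A minimal self-contained sketch, reusing the names from the example above (the demo class name is illustrative):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Without RUNTIME retention the annotation is discarded before run time,
// and getClass().getAnnotation(Bar.class) would always return null.
@Retention(RetentionPolicy.RUNTIME)
@interface Bar {}

class Sup {
    // The class either carries @Bar or it doesn't; nothing to override.
    final boolean isBar() {
        return this.getClass().getAnnotation(Bar.class) != null;
    }
}

@Bar
class Sub1 extends Sup {}

class Sub2 extends Sup {}

public class Tech3Demo {
    public static void main(String[] args) {
        System.out.println(new Sub1().isBar()); // true
        System.out.println(new Sub2().isBar()); // false
    }
}
```

Also note that without @Inherited on the annotation declaration, a subclass of Sub1 would answer false again -- the annotation applies only to the class that directly carries it, which is a real behavioural difference from the interface-based Technique 2.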
Technique 4 answers that same question by making the developer fit his class properly into a class structure that is no longer strictly hierarchical. That isn't available in Java, but it might be in other languages, and since the question is really about the upsides and downsides of each technique per se, I mention it nonetheless.
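As to the >1-characteristic requirement: Technique 2 generalizes directly, since a class may implement any number of marker interfaces. A sketch, where Foo and the class names are illustrative:

```java
// Marker interfaces, one per characteristic
interface Bar {}
interface Foo {}

class Sup {
    final boolean isBar() { return this instanceof Bar; }
    final boolean isFoo() { return this instanceof Foo; }
}

// Each subclass opts into any combination of characteristics
class Sub1 extends Sup implements Bar, Foo {}
class Sub2 extends Sup implements Bar {}
class Sub3 extends Sup {}

public class Tech2Demo {
    public static void main(String[] args) {
        Sup s = new Sub1();
        System.out.println(s.isBar() + " " + s.isFoo()); // true true
        System.out.println(new Sub3().isBar());          // false
    }
}
```

This is arguably the closest Java gets to the non-hierarchical class structure of Technique 4, without leaving the language.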
In particular I was wondering/probing whether technique 3 is one of your "abuses" or not, but I could just as well broaden the question a bit. Clearer?
Not really. I don't understand what is meant by "the same question" in each example. I suppose Techniques 1 and 2 could notionally be considered "the same question", but Technique 3 isn't. I don't consider @interface to be a form of interface.
In the examples, the "same question" is "Are you a Bar?". But it could also be "What is the name of the initial Presentation you want to show?". Anything, really.
Quote from Dave Voorhis on April 17, 2021, 10:08 pm
Quote from Erwin on April 17, 2021, 9:16 pm
In the examples, the "same question" is "Are you a Bar?". But it could also be "What is the name of the initial Presentation you want to show?". Anything, really.
Ok, at the point of my previous response I kind of thought I was beginning to get what you were asking, sort of.
But with your response, despite writing Java code pretty much every day for nearly 25 years and noting that your examples are clear and straightforward...
I have no idea what you're asking.
I must be missing something obvious.
Quote from dandl on April 18, 2021, 2:55 am
Quote from Dave Voorhis on April 17, 2021, 9:55 am
Quote from dandl on April 17, 2021, 8:58 am
I know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine and we just need to be a bit more creative in how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
I marvel at how your suggestions are often almost diametrically the opposite of it.
The opposite? Really?
- No meta-programming, so Java loses annotations.
Yes, annotations are mostly dire. They've sometimes been used to good effect -- e.g., to indicate special methods like "this is a unit test" without relying on language extension -- but all too often are used to create non-programmable, unreadable, unmaintainable, difficult-to-debug half-attempts at domain-specific-languages, typically when they're not required at all. A simple, straightforward library of classes would often have been better, but, alas, annotations are/were fashionable.
I'm seeing growing pushback against such annotation abuse, though, which is good.
No argument, but underlying this is a key problem of pushing work the compiler should do onto the runtime: annotations/attributes, reflection, casts, etc. There must be a better way to get more mileage out of the compiler.
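The cast case makes that concrete: a raw collection defers the type check to run time (where it surfaces as a ClassCastException), whereas the generic version lets the compiler reject the same mistake outright. A small sketch (class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class CompilerVsRuntime {
    // Raw list: the element type is checked only at run time, via the cast.
    // Returns true when that deferred check fails.
    @SuppressWarnings({"unchecked", "rawtypes"})
    static boolean rawCastFails() {
        List raw = new ArrayList();
        raw.add(42);                        // compiles without complaint
        try {
            String s = (String) raw.get(0); // blows up here instead
            return false;
        } catch (ClassCastException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(rawCastFails()); // true

        // The generic equivalent moves the same check to compile time:
        List<String> typed = new ArrayList<>();
        typed.add("hello");
        // typed.add(42);  // rejected by the compiler: incompatible types
        System.out.println(typed.get(0));   // no cast needed
    }
}
```

Annotations and reflection sit in the same bucket: mistakes that a richer static mechanism could catch at compile time instead surface as run-time failures.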
- And unsafer-lower-longer sounds like you want pointers, malloc, stack faults, lots of undefined behaviour. I have news: it's been done. It's called C. I don't think you want it.
No, obviously not, and intentional knee-jerk mischaracterisations/misinterpretations don't help.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20-year steps, and each unarguably moves in that direction. So I'm asking: the 20 years are up, so where do you see the next step? Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
Consider improvements to language, framework, and tool safety, expressiveness, clarity, and power. Avoid buzzwordy marketing-speak like "safer/higher/shorter" (which only implies we should use APL or Z notation), and look at work done over the last couple of decades to produce languages and systems that do improve safety, expressiveness, clarity, and power whilst avoiding some of the mistakes (*cough*text replacement macros*cough*templates*cough*non-obligatory handling of union/option type members*cough*) of the past.
The reason I'm emphasising those points is that they were the prime drivers to get to this point (particularly safer). C++ is already more expressive than Java, but it's unsafe and too concerned with low level stuff. Rust is safer than C++, but also very close to the metal. I look at Java (or C#) and I see nothing to add: the language is done, future tweaks are driven by new technology and new business needs. But it's unsafe, low level and full of accidental complexity, so that's what I would see needs to change. Revolution not evolution.
But you still aren't contributing. Do you really see Java going on for the next 50 years, or do you have any ideas at all for what life after Java (or D) might look like?
Quote from Dave Voorhis on April 18, 2021, 11:52 am
Quote from dandl on April 18, 2021, 2:55 am
Quote from Dave Voorhis on April 17, 2021, 9:55 am
Quote from dandl on April 17, 2021, 8:58 am
I know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine, we just need to be a bit more creative how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
Well, yes. That's indeed partly what started the recent spate of forum activity -- talking about what might make an ideal language to host development of a D, or even better, one that could be a full D merely by writing some (must be statically typed!) "relational algebra" libraries, rather than by language extension, code generation, run-time-only safety, or employing whole new interpreters/compilers.
But that said, for the majority of ordinary (for some undefined value of "ordinary") day-to-day 3GL-level work, Java is fine. Not as safe or elegant as Kotlin, say, or (e.g.) as feature-rich as C#, but certainly fine for typical 3GL jobs. Definitely much superior to C++. That's why Java is heavily used and why alternatives like Kotlin, Scala, Groovy (ugh!), even C# or Python, haven't even started to replace it.
I marvel at how your suggestions are often almost diametrically the opposite of it.
The opposite? Really?
- No meta-programming, so Java loses annotations.
Yes, annotations are mostly dire. They've sometimes been used to good effect -- e.g., to indicate special methods like "this is a unit test" without relying on language extension -- but all too often are used to create non-programmable, unreadable, unmaintainable, difficult-to-debug half-attempts at domain-specific-languages, typically when they're not required at all. A simple, straightforward library of classes would often have been better, but, alas, annotations are/were fashionable.
I'm seeing growing pushback against such annotation abuse, though, which is good.
No argument, but underlying this is a key problem of pushing work the compiler should do onto the runtime: annotations/attributes, reflection, casts, etc. There must be a better way to get more mileage out of the compiler.
- And unsafer-lower-longer sounds like you want pointers, malloc, stack faults, lots of undefined behaviour. I have news: it's been done. It's called C. I don't think you want it.
No, obviously not, and intentional knee-jerk mischaracterisations/misinterpretations don't help.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20-year steps, and each unarguably moves in that direction. So I'm asking: the 20 years are up, so where do you see the next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
That will cover the majority of application programming. For low-level close(r)-to-the-metal (however you define it) 3GL-is-the-right-level tool/systems development, we have Rust, Kotlin, and almost certainly continued use of C#, Java and even C and C++.
Where these are the wrong tools for the job, obviously they are unsuited, unsafe, unduly verbose, inexpressive, whatever.
Where they are the right tools for the job, they suit it well.
Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
The main thing it lost was high-performance function pointers. There are workarounds, but they're slow. Though I've been meaning to experiment with lambdas to see if they can effectively (in terms of performance) be used instead.
I've not encountered any particular limitation due to missing immutable types, type aliases, text macros, etc.
Missing conveniences need to be distinguished from absent necessities.
Consider improvements to language, framework, and tool safety, expressiveness, clarity, and power. Avoid buzzwordy marketing-speak like "safer/higher/shorter" (which only implies we should use APL or Z notation), and look at work done over the last couple of decades to produce languages and systems that do improve safety, expressiveness, clarity, and power whilst avoiding some of the mistakes (*cough*text replacement macros*cough*templates*cough*non-obligatory handling of union/option type members*cough*) of the past.
The reason I'm emphasing those points is because they were the prime drivers to get to this point (particularly safer). C++ is already more expressive than Java, but it's unsafe and too concerned with low level stuff. Rust is safer than C++, but also very close to the metal. I look at Java (or C#) and I see nothing to add: the language is done, future tweaks are driven by new technology and new business needs. But it's unsafe, low level and full of accidental complexity, so that's what I would see needs to change. Revolution not evolution.
But you still aren't contributing. Do you really see Java going on for the next 50 years, or do you have any ideas at all for what life after Java (or D) might look like?
Yes, I really see Java going on for the next 50 years, much as C has been going on for the last 50 years.
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
Quote from dandl on April 18, 2021, 2:55 amQuote from Dave Voorhis on April 17, 2021, 9:55 amQuote from dandl on April 17, 2021, 8:58 amI know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine, we just need to be a bit more creative how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
Well, yes. That's indeed partly what started the recent spate of forum activity -- talking about what might make an ideal language to host development of a D, or even better, could be a full D by merely writing some (must be statically typed!) "relational algebra" libraries, rather than extension, generation, only run-time safety, or employing whole new interpreters/compilers.
But that said, for the majority of ordinary (for some undefined value of "ordinary") day-to-day 3GL-level work, Java is fine. Not as safe or elegant as Kotlin, say, or (e.g.) as feature-rich as C#, but certainly fine for typical 3GL jobs. Definitely much superior to C++. That's why Java is heavily used and why alternatives like Kotlin, Scala, Groovy (ugh!), even C# or Python, haven't even started to replace it.
I marvel at how your suggestions are often almost diametrically the opposite of it.
The opposite? Really?
- No meta-programming, so Java loses annotations.
Yes, annotations are mostly dire. They've sometimes been used to good effect -- e.g., to indicate special methods like "this is a unit test" without relying on language extension -- but all too often are used to create non-programmable, unreadable, unmaintainable, difficult-to-debug half-attempts at domain-specific-languages, typically when they're not required at all. A simple, straightforward library of classes would often have been better, but, alas, annotations are/were fashionable.
I'm seeing growing pushback against such annotation abuse, though, which is good.
No argument, but underlying this is a key problem of pushing work the compiler should do onto the runtime: annotations/attributes, reflection, casts, etc. There must be a better way to get more mileage out of the compiler.
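To make the compile-time-versus-runtime contrast concrete, here is a minimal sketch (hypothetical class names, echoing the Tech 2 and Tech 3 examples above). The reflective check only fails at run time if the annotation is misapplied or lacks runtime retention, whereas the marker-interface check is verified entirely by the compiler:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class CompileVsRuntime {
    // Runtime approach: a marker annotation inspected via reflection.
    // Forgetting @Retention(RetentionPolicy.RUNTIME), or checking the
    // wrong annotation class, is only discovered when the code runs.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Bar {}

    @Bar static class Sub1 {}
    static class Sub2 {}

    static boolean isBarByReflection(Object o) {
        return o.getClass().getAnnotation(Bar.class) != null;
    }

    // Compile-time approach: a marker interface. The check is plain
    // instanceof, with no reflection machinery, and the "is-a Bar2"
    // relationship is part of the type declaration itself.
    interface Bar2 {}
    static class Sub3 implements Bar2 {}
    static class Sub4 {}

    static boolean isBarByType(Object o) {
        return o instanceof Bar2;
    }

    public static void main(String[] args) {
        System.out.println(isBarByReflection(new Sub1())); // true
        System.out.println(isBarByReflection(new Sub2())); // false
        System.out.println(isBarByType(new Sub3()));       // true
        System.out.println(isBarByType(new Sub4()));       // false
    }
}
```

Both techniques answer the same question; the difference is which tool (compiler or runtime) is doing the work.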
- And unsafer-lower-longer sounds like you want pointers, malloc, stack faults, lots of undefined behaviour. I have news: it's been done. It's called C. I don't think you want it.
No, obviously not, and intentional knee-jerk mischaracterisations/misinterpretations don't help.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20 year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
That will cover the majority of application programming. For low-level close(r)-to-the-metal (however you define it) 3GL-is-the-right-level tool/systems development, we have Rust, Kotlin, and almost certainly continued use of C#, Java and even C and C++.
Where these are the wrong tools for the job, obviously they are unsuited, unsafe, unduly verbose, inexpressive, whatever.
Where they are the right tools for the job, they suit it well.
Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
The main thing it lost was high-performance function pointers. There are workarounds, but they're slow. Though I've been meaning to experiment with lambdas to see if they can effectively (in terms of performance) be used instead.
I've not encountered any particular limitation due to missing immutable types, type aliases, text macros, etc.
Missing conveniences need to be distinguished from absent necessities.
Consider improvements to language, framework, and tool safety, expressiveness, clarity, and power. Avoid buzzwordy marketing-speak like "safer/higher/shorter" (which only implies we should use APL or Z notation), and look at work done over the last couple of decades to produce languages and systems that do improve safety, expressiveness, clarity, and power whilst avoiding some of the mistakes (*cough*text replacement macros*cough*templates*cough*non-obligatory handling of union/option type members*cough*) of the past.
The reason I'm emphasising those points is because they were the prime drivers to get to this point (particularly safer). C++ is already more expressive than Java, but it's unsafe and too concerned with low level stuff. Rust is safer than C++, but also very close to the metal. I look at Java (or C#) and I see nothing to add: the language is done, future tweaks are driven by new technology and new business needs. But it's unsafe, low level and full of accidental complexity, so that's what I would see needs to change. Revolution not evolution.
But you still aren't contributing. Do you really see Java going on for the next 50 years, or do you have any ideas at all for what life after Java (or D) might look like?
Yes, I really see Java going on for the next 50 years, much as C has been going on for the last 50 years.
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
Quote from Erwin on April 18, 2021, 11:55 amQuote from Dave Voorhis on April 17, 2021, 10:08 pmOk, at the point of my previous response I kind of thought I was beginning to get what you were asking, sort of.
But with your response, despite writing Java code pretty much every day for nearly 25 years and noting that your examples are clear and straightforward...
I have no idea what you're asking.
I must be missing something obvious.
You said "annotations are mostly dire" and called them "abuse". Is technique 3 "dire" and "abuse"? To me, it seems a valid way to avoid techniques that might otherwise be regarded as "too much boilerplate" -- dandl's intended meaning where he says "shorter", afaict. So if your answer is "yes" then my next question is "why", and if it is "no" then my next question is "then what does an abusive use look like?".
(Remember I was asking just out of curiosity.)
Quote from dandl on April 18, 2021, 1:48 pmQuote from Dave Voorhis on April 18, 2021, 11:52 amQuote from dandl on April 18, 2021, 2:55 amQuote from Dave Voorhis on April 17, 2021, 9:55 amQuote from dandl on April 17, 2021, 8:58 amI know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine, we just need to be a bit more creative how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
Well, yes. That's indeed partly what started the recent spate of forum activity -- talking about what might make an ideal language to host development of a D, or even better, could be a full D by merely writing some (must be statically typed!) "relational algebra" libraries, rather than extension, generation, only run-time safety, or employing whole new interpreters/compilers.
But that said, for the majority of ordinary (for some undefined value of "ordinary") day-to-day 3GL-level work, Java is fine. Not as safe or elegant as Kotlin, say, or (e.g.) as feature-rich as C#, but certainly fine for typical 3GL jobs. Definitely much superior to C++. That's why Java is heavily used and why alternatives like Kotlin, Scala, Groovy (ugh!), even C# or Python, haven't even started to replace it.
Taking the first generation as machine code, 2nd as ASM, I would argue that 3GL is Fortran/Cobol/Basic, the languages with predefined types. To me, the languages of user-definable types are the true 4GLs: C and Algol just barely but Pascal, VB6, C++, Java/C# etc and the dynamic types are all on much the same level. Java is doing exactly the same jobs as VB6 before it: the main difference is the facilities for writing libraries and APIs. VBA lives on in the various Office products, and C# has a few niches (I'm writing for Unity at present), but as you say, the mental processes are similar, it's the confounded details that keep giving me hell!
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20 year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
Why do you hold that view, with so little evidence to support it? There are languages offering each of those things, and they get precious little traction.
That will cover the majority of application programming. For low-level close(r)-to-the-metal (however you define it) 3GL-is-the-right-level tool/systems development, we have Rust, Kotlin, and almost certainly continued use of C#, Java and even C and C++.
Where these are the wrong tools for the job, obviously they are unsuited, unsafe, unduly verbose, inexpressive, whatever.
Where they are the right tools for the job, they suit it well.
C++ => Rust and friends is a reasonable path for low level, where the need for safety around memory allocation and concurrency is urgent. For general computing, they are the wrong things to focus on.
Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
The main thing it lost was high-performance function pointers. There are workarounds, but they're slow. Though I've been meaning to experiment with lambdas to see if they can effectively (in terms of performance) be used instead.
I've not encountered any particular limitation due to missing immutable types, type aliases, text macros, etc.
Missing conveniences need to be distinguished from absent necessities.
Performance is another wrong thing to focus on. If the language is good enough, the compiler can figure out how to make it fast enough.
Immutable types are not a good fit in Java (problematic in C# too) as long as you have new and null pointers. Immutable types are part of safer: telling the compiler how to avoid bugs.
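The immutability point can be illustrated with a small sketch (hypothetical names, assuming Java 16+ for records). A record gives shallow immutability -- final components, no setters -- but, as noted above, the type system still admits null references, so the safety is partial:

```java
public class ImmutablePoint {
    // Java 16+ records are shallowly immutable: components are final and
    // there are no setters. "Mutation" is expressed by returning new values.
    record Point(int x, int y) {
        Point translate(int dx, int dy) {
            return new Point(x + dx, y + dy); // new value; the original is untouched
        }
    }

    public static void main(String[] args) {
        Point p = new Point(1, 2);
        Point q = p.translate(3, 4);
        System.out.println(p); // Point[x=1, y=2] -- original unchanged
        System.out.println(q); // Point[x=4, y=6]

        Point r = null;        // but the type system still admits null,
        System.out.println(r == null); // so immutability alone isn't full safety
    }
}
```

This is the tension described above: the value semantics are there, but new/null escape hatches remain.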
Consider improvements to language, framework, and tool safety, expressiveness, clarity, and power. Avoid buzzwordy marketing-speak like "safer/higher/shorter" (which only implies we should use APL or Z notation), and look at work done over the last couple of decades to produce languages and systems that do improve safety, expressiveness, clarity, and power whilst avoiding some of the mistakes (*cough*text replacement macros*cough*templates*cough*non-obligatory handling of union/option type members*cough*) of the past.
The reason I'm emphasising those points is because they were the prime drivers to get to this point (particularly safer). C++ is already more expressive than Java, but it's unsafe and too concerned with low level stuff. Rust is safer than C++, but also very close to the metal. I look at Java (or C#) and I see nothing to add: the language is done, future tweaks are driven by new technology and new business needs. But it's unsafe, low level and full of accidental complexity, so that's what I would see needs to change. Revolution not evolution.
But you still aren't contributing. Do you really see Java going on for the next 50 years, or do you have any ideas at all for what life after Java (or D) might look like?
Yes, I really see Java going on for the next 50 years, much as C has been going on for the last 50 years.
That's stretching it a bit. C as a first choice programming language probably had a life of less than 20 years, from K&R to Stroustrup. The replacement of C by C++ is the kind of thing I expect should happen to Java. And please note: C++ wins because of safer-higher-shorter. This is exactly why C++ strings and collections win out over char* and pointer arithmetic, despite the added complexity.
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
That's depressing.
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
If that's who we're waiting for, it's even more depressing, but I think it's wrong. The FP tank is empty, Prolog is niche and apart from unions, there isn't much more to get out of type theory. [Please enlighten me if I missed something.]
The real challenge (as it is in every startup) is to figure out which problem to solve, who has the problem, and how much a solution is worth. Once you nail that, the rest is just hard grind.
Quote from Dave Voorhis on April 18, 2021, 4:32 pmQuote from dandl on April 18, 2021, 1:48 pmQuote from Dave Voorhis on April 18, 2021, 11:52 amQuote from dandl on April 18, 2021, 2:55 amQuote from Dave Voorhis on April 17, 2021, 9:55 amQuote from dandl on April 17, 2021, 8:58 amI know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine, we just need to be a bit more creative how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
Well, yes. That's indeed partly what started the recent spate of forum activity -- talking about what might make an ideal language to host development of a D, or even better, could be a full D by merely writing some (must be statically typed!) "relational algebra" libraries, rather than extension, generation, only run-time safety, or employing whole new interpreters/compilers.
But that said, for the majority of ordinary (for some undefined value of "ordinary") day-to-day 3GL-level work, Java is fine. Not as safe or elegant as Kotlin, say, or (e.g.) as feature-rich as C#, but certainly fine for typical 3GL jobs. Definitely much superior to C++. That's why Java is heavily used and why alternatives like Kotlin, Scala, Groovy (ugh!), even C# or Python, haven't even started to replace it.
Taking the first generation as machine code, 2nd as ASM, I would argue that 3GL is Fortran/Cobol/Basic, the languages with predefined types. To me, the languages of user-definable types are the true 4GLs: C and Algol just barely but Pascal, VB6, C++, Java/C# etc and the dynamic types are all on much the same level. Java is doing exactly the same jobs as VB6 before it: the main difference is the facilities for writing libraries and APIs. VBA lives on in the various Office products, and C# has a few niches (I'm writing for Unity at present), but as you say, the mental processes are similar, it's the confounded details that keep giving me hell!
This highlights one of the reasons the terms 3GL, 4GL, etc. were never used much in academic discourse, and aren't really used any more in casual discussion either. But, yes, the mental processes for all of the above -- except aspects of machine code and assembly languages -- are essentially the same. I don't find the details of the languages themselves to be particularly problematic. What I find problematic is building skyscraper-sized software systems out of low(ish) level toothpicks.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20 year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
Why do you hold that view, with so little evidence to support it? There are languages offering each of those things, and they get precious little traction.
They get traction in specialist niches, where they demonstrate dramatic improvements in productivity over the usual imperative alternatives. If their benefits can be generalised -- or at least made more general -- than narrow specialist niches, then they offer potential benefits that no amount of rearranging imperative programming can possibly reach.
That will cover the majority of application programming. For low-level close(r)-to-the-metal (however you define it) 3GL-is-the-right-level tool/systems development, we have Rust, Kotlin, and almost certainly continued use of C#, Java and even C and C++.
Where these are the wrong tools for the job, obviously they are unsuited, unsafe, unduly verbose, inexpressive, whatever.
Where they are the right tools for the job, they suit it well.
C++ => Rust and friends is a reasonable path for low level, where the need for safety around memory allocation and concurrency is urgent. For general computing, they are the wrong things to focus on.
That's why for general relatively low level, but not on-the-metal level computing we have C#, Java, Kotlin, Python, Julia, bash/ksh/csh, you-name-it -- depending on what you need to do, of course. They're not all equally suited to the same things.
Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
The main thing it lost was high-performance function pointers. There are workarounds, but they're slow. Though I've been meaning to experiment with lambdas to see if they can effectively (in terms of performance) be used instead.
I've not encountered any particular limitation due to missing immutable types, type aliases, text macros, etc.
Missing conveniences need to be distinguished from absent necessities.
Performance is another wrong thing to focus on.
It was a thing I was focusing on at the time. I'm referencing a specific case -- the details of which are irrelevant here -- and there was no way in Java to achieve what I could with C/C++ function pointers, and had to accept the performance hit.
I've not found anything else that didn't have a Java equivalent or some reasonably performant workaround. (I appreciate that others might have found different blockers, or none at all.)
If the language is good enough, the compiler can figure out how to make it fast enough.
It didn't.
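For what it's worth, the lambda-instead-of-function-pointer experiment mentioned above might look something like this minimal sketch (hypothetical names; a C-style dispatch table approximated with `java.util.function` interfaces). Modern JITs can often inline through such call sites, which is why lambdas may close much of the gap, though that is workload-dependent and not guaranteed:

```java
import java.util.function.IntBinaryOperator;

public class Dispatch {
    // A "function pointer table" in the C sense, approximated with lambdas.
    // Each entry is an IntBinaryOperator; invocation is a virtual call that
    // the JIT may devirtualise and inline at hot call sites.
    static final IntBinaryOperator[] OPS = {
        (a, b) -> a + b,   // opcode 0: add
        (a, b) -> a - b,   // opcode 1: subtract
        (a, b) -> a * b    // opcode 2: multiply
    };

    static int apply(int opcode, int a, int b) {
        return OPS[opcode].applyAsInt(a, b);
    }

    public static void main(String[] args) {
        System.out.println(apply(0, 6, 7)); // 13
        System.out.println(apply(2, 6, 7)); // 42
    }
}
```

Whether this matches raw C function-pointer performance in a given workload is exactly the kind of thing that needs measuring rather than assuming.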
Immutable types are not a good fit in Java (problematic in C# too) as long as you have new and null pointers. Immutable types are part of safer: telling the compiler how to avoid bugs.
Consider improvements to language, framework, and tool safety, expressiveness, clarity, and power. Avoid buzzwordy marketing-speak like "safer/higher/shorter" (which only implies we should use APL or Z notation), and look at work done over the last couple of decades to produce languages and systems that do improve safety, expressiveness, clarity, and power whilst avoiding some of the mistakes (*cough*text replacement macros*cough*templates*cough*non-obligatory handling of union/option type members*cough*) of the past.
The reason I'm emphasising those points is because they were the prime drivers to get to this point (particularly safer). C++ is already more expressive than Java, but it's unsafe and too concerned with low level stuff. Rust is safer than C++, but also very close to the metal. I look at Java (or C#) and I see nothing to add: the language is done, future tweaks are driven by new technology and new business needs. But it's unsafe, low level and full of accidental complexity, so that's what I would see needs to change. Revolution not evolution.
But you still aren't contributing. Do you really see Java going on for the next 50 years, or do you have any ideas at all for what life after Java (or D) might look like?
Yes, I really see Java going on for the next 50 years, much as C has been going on for the last 50 years.
That's stretching it a bit. C as a first choice programming language probably had a life of less than 20 years, from K&R to Stroustrup. The replacement of C by C++ is the kind of thing I expect should happen to Java. And please note: C++ wins because of safer-higher-shorter. This is exactly why C++ strings and collections win out over char* and pointer arithmetic, despite the added complexity.
C is still significantly a first choice language. By some measures, more programming is embedded systems than business/enterprise IT or commercial application development, and it's mostly C. It may often be called C++ -- and compiled with a C++ compiler -- but minimal if any use of classes, new/delete, etc. It's essentially C.
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
That's depressing.
If your goal is to create the next "big" language, maybe. The last significant small company commercial effort of any note in that vein was perhaps that "other" D, Digital Mars's D, and whilst you rarely see mention of it now, I understand it's doing well (relatively speaking.)
But creating useful libraries, frameworks, platforms, tools of other kinds which work with the current popular language pantheon seems like great opportunity, and that's hardly depressing.
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
If that's who we're waiting for, it's even more depressing, but I think it's wrong. The FP tank is empty, Prolog is niche and apart from unions, there isn't much more to get out of type theory. [Please enlighten me if I missed something.]
Computational type theory is no more or less than the study of mechanisms to reduce possible bugs in programs, so there's probably still a lot to get out of that.
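One everyday instance of "type theory as bug reduction" is already in Java: `Optional` moves "no result" into the type, so the caller must handle absence explicitly instead of risking a `NullPointerException`. A small sketch (the `findUser` lookup is hypothetical):

```java
import java.util.Optional;

public class Lookup {
    // Returning Optional instead of null makes "no result" part of the
    // method's type, so callers must decide what absence means.
    static Optional<String> findUser(int id) {
        return id == 1 ? Optional.of("alice") : Optional.empty();
    }

    public static void main(String[] args) {
        // Absence is handled explicitly at the call site.
        String name = findUser(2).orElse("<unknown>");
        System.out.println(name); // prints <unknown>
    }
}
```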
There are companies using functional and logical programming to build software tools that other companies buy and use. As long as that appears set to continue and grow -- and it does; this is cutting-edge stuff, these are absolutely not legacy products -- and as long as there isn't anything in general programming that precludes effective use of functional and logical programming (in general, there isn't) then the fact these paradigms currently largely thrive only in some niches represents opportunity, even if only (at least initially) to fill other niches.
Quote from dandl on April 18, 2021, 1:48 pm
Quote from Dave Voorhis on April 18, 2021, 11:52 am
Quote from dandl on April 18, 2021, 2:55 am
Quote from Dave Voorhis on April 17, 2021, 9:55 am
Quote from dandl on April 17, 2021, 8:58 am
I know exactly what I want, and I'm working on making it, of course.
So are you going to tell us, or is it a secret?
I've mentioned stuff I'm working on a number of times.
Mentioned is not what I had in mind. Everything you've mentioned sounds like Java is just fine, we just need to be a bit more creative how we use it. But Java is what it is, warts and all, and it's time to contemplate what comes next.
Well, yes. That's indeed partly what started the recent spate of forum activity -- talking about what might make an ideal language to host development of a D, or even better, one that could be a full D merely by writing some (must be statically typed!) "relational algebra" libraries, rather than via language extension, code generation, run-time-only safety, or whole new interpreters/compilers.
But that said, for the majority of ordinary (for some undefined value of "ordinary") day-to-day 3GL-level work, Java is fine. Not as safe or elegant as Kotlin, say, or (e.g.) as feature-rich as C#, but certainly fine for typical 3GL jobs. Definitely much superior to C++. That's why Java is heavily used and why alternatives like Kotlin, Scala, Groovy (ugh!), even C# or Python, haven't even started to replace it.
Taking the first generation as machine code, 2nd as ASM, I would argue that 3GL is Fortran/Cobol/Basic, the languages with predefined types. To me, the languages of user-definable types are the true 4GLs: C and Algol just barely but Pascal, VB6, C++, Java/C# etc and the dynamic types are all on much the same level. Java is doing exactly the same jobs as VB6 before it: the main difference is the facilities for writing libraries and APIs. VBA lives on in the various Office products, and C# has a few niches (I'm writing for Unity at present), but as you say, the mental processes are similar, it's the confounded details that keep giving me hell!
This highlights one of the reasons the terms 3GL, 4GL, etc. were never used much in academic discourse, and aren't really used any more in casual discussion either. But, yes, the mental processes for all of the above -- except aspects of machine code and assembly languages -- are essentially the same. I don't find the details of the languages themselves to be particularly problematic. What I find problematic is building skyscraper-sized software systems out of low(ish) level toothpicks.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20-year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
Why do you hold that view, with so little evidence to support it? There are languages offering each of those things, and they get precious little traction.
They get traction in specialist niches, where they demonstrate dramatic improvements in productivity over the usual imperative alternatives. If their benefits can be generalised -- or at least extended beyond narrow specialist niches -- then they offer potential benefits that no amount of rearranging imperative programming can possibly reach.
That will cover the majority of application programming. For low-level close(r)-to-the-metal (however you define it) 3GL-is-the-right-level tool/systems development, we have Rust, Kotlin, and almost certainly continued use of C#, Java and even C and C++.
Where these are the wrong tools for the job, obviously they are unsuited, unsafe, unduly verbose, inexpressive, whatever.
Where they are the right tools for the job, they suit it well.
C++ => Rust and friends is a reasonable path for low level, where the need for safety around memory allocation and concurrency is urgent. For general computing, they are the wrong things to focus on.
That's why for general relatively low level, but not on-the-metal level computing we have C#, Java, Kotlin, Python, Julia, bash/ksh/csh, you-name-it -- depending on what you need to do, of course. They're not all equally suited to the same things.
Java lost meta-programming in the pursuit of safer; it also lost immutable types, type aliases and a few other minor things, all valuable in their own way, which means it lost expressiveness.
The main thing it lost was high-performance function pointers. There are workarounds, but they're slow. Though I've been meaning to experiment with lambdas to see if they can effectively (in terms of performance) be used instead.
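The lambda experiment mentioned above might look something like this (a sketch, not a benchmark): a small functional interface plays the role of a C function pointer, and since a lambda compiles to an `invokedynamic` call site, the JIT can usually inline it after warm-up, unlike a reflective `Method.invoke` call.

```java
public class Dispatch {
    // A functional interface standing in for a C function pointer type.
    interface IntOp {
        int apply(int x);
    }

    public static void main(String[] args) {
        IntOp square = x -> x * x;
        IntOp negate = x -> -x;

        // A dispatch table, as one would build with C function pointers.
        IntOp[] table = { square, negate };
        int acc = 0;
        for (IntOp op : table) {
            acc += op.apply(3);
        }
        System.out.println(acc); // 9 + (-3) = 6
    }
}
```

Whether this actually closes the gap with C function pointers in any given hot path would need measuring; the claim here is only that the dispatch-table idiom is expressible.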
Quote from Dave Voorhis on April 18, 2021, 4:41 pm
Quote from Erwin on April 18, 2021, 11:55 am
Quote from Dave Voorhis on April 17, 2021, 10:08 pm
Ok, at the point of my previous response I kind of thought I was beginning to get what you were asking, sort of.
But with your response, despite writing Java code pretty much every day for nearly 25 years and noting that your examples are clear and straightforward...
I have no idea what you're asking.
I must be missing something obvious.
You said "annotations are mostly dire" and called them "abuse". Is Technique 3 "dire" and "abuse"? To me, it seems a valid way to avoid techniques that might otherwise be regarded as "too much boilerplate" -- dandl's intended meaning where he says "shorter", afaict. So if your answer is "yes" then my next question is "why", and if it is "no" then my next question is "then what does an abusive use look like".
(Remember I was asking just out of curiosity.)
Ah, perhaps I was reading more into it than was intended.
I don't really have a problem with Technique 3 -- it's just using annotations essentially as intended, with runtime reflection to inquire as to whether a given class has been annotated with a particular user-defined annotation -- though I'm not clear as to the purpose.
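One detail worth noting about Technique 3 as posted: for `getAnnotation` to see the annotation at runtime, `@Bar` needs `@Retention(RetentionPolicy.RUNTIME)` (the default retention is `CLASS`, which is invisible to reflection). A complete, runnable version:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class AnnotationDemo {
    // Without RUNTIME retention, getAnnotation() returns null even for @Bar classes.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Bar {}

    static class Sup {
        final boolean isBar() {
            return this.getClass().getAnnotation(Bar.class) != null;
        }
    }

    @Bar
    static class Sub1 extends Sup {}

    static class Sub2 extends Sup {}

    public static void main(String[] args) {
        System.out.println(new Sub1().isBar()); // true
        System.out.println(new Sub2().isBar()); // false
    }
}
```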
What I (and others) have a problem with are using annotations to essentially define a sublanguage -- see Spring for the canonical example(s) -- but it's a sublanguage that cannot be orchestrated by the language that hosts it.
Quote from dandl on April 19, 2021, 5:08 am
Taking the first generation as machine code, 2nd as ASM, I would argue that 3GL is Fortran/Cobol/Basic, the languages with predefined types. To me, the languages of user-definable types are the true 4GLs: C and Algol just barely but Pascal, VB6, C++, Java/C# etc and the dynamic types are all on much the same level. Java is doing exactly the same jobs as VB6 before it: the main difference is the facilities for writing libraries and APIs. VBA lives on in the various Office products, and C# has a few niches (I'm writing for Unity at present), but as you say, the mental processes are similar, it's the confounded details that keep giving me hell!
This highlights one of the reasons the terms 3GL, 4GL, etc. were never used much in academic discourse, and aren't really used any more in casual discussion either. But, yes, the mental processes for all of the above -- except aspects of machine code and assembly languages -- are essentially the same. I don't find the details of the languages themselves to be particularly problematic. What I find problematic is building skyscraper-sized software systems out of low(ish) level toothpicks.
No, my point is that the mental model changed a lot after Fortran/Cobol/Algol/Basic, with the introduction of real type systems and user-defined types. I have many thousands of lines of code written pre-1985, and it really is about the toothpicks, as against the ice-cream sticks we're using now. ASM -> HLL was a big step, and HLL -> HLL with type systems was another. I would argue that the encapsulation features and generics are another such, because of how they allow these gigantic libraries to be packaged. But I've seen no such step in the past 20 years.
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20-year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
Why do you hold that view, with so little evidence to support it? There are languages offering each of those things, and they get precious little traction.
They get traction in specialist niches, where they demonstrate dramatic improvements in productivity over the usual imperative alternatives. If their benefits can be generalised -- or at least extended beyond narrow specialist niches -- then they offer potential benefits that no amount of rearranging imperative programming can possibly reach.
So I tried to find some languages to test that idea. See https://builtin.com/software-engineering-perspectives/new-programming-languages and https://www.rankred.com/new-programming-languages-to-learn/. If there are other people out there who see things as I do, they will make themselves visible by creating new languages.
So I pruned the list by excluding languages:
- that have a history of more than 20 years (any direct derivative of ML/OCaml, Python, Elixir?),
- those that simply try to make JS development saner (Elm, TypeScript and probably Dart),
- two that focus rather strongly on the numerical niche (R, Julia),
- and I'm dubious about Go (boring, low level) and Pony (not stable).
Of those that are left, I tried to find the five principles: safer-higher-shorter, meta and fits in (meaning interoperable with existing code/libraries).
- words for safer are very common, nominating the areas of type inference, memory, and concurrency
- words for shorter appear often, e.g. 'concise'
- words suggesting higher do not (except in the OCaml family)
- Crystal, Elixir, Groovy, Nim and Rust have meta-programming/macros
- there is a mix of native and VM targets, with words like 'interoperable' and 'performance'. Only Kotlin and Groovy target the JVM, but most offer some way of interacting with native code.
So this research tells me that most (4 of 5) of the principles I've been plugging are out there driving language development. Of this list, the only ones suitable for general app development right now are probably Groovy and Nim, with Crystal as an outsider. The principles of safe and meta are well-served, but any attempt to find higher is doomed to failure.
I've not found anything else that didn't have a Java equivalent or some reasonably performant workaround. (I appreciate that others might have found different blockers, or none at all.)
JIT code is usually fast enough, but you are at the mercy of the GC. The benchmarks where C# comes out in front are those using non-GC value types.
That's stretching it a bit. C as a first choice programming language probably had a life of less than 20 years, from K&R to Stroustrup. The replacement of C by C++ is the kind of thing I expect should happen to Java. And please note: C++ wins because of safer-higher-shorter. This is exactly why C++ strings and collections win out over char* and pointer arithmetic, despite the added complexity.
C is still significantly a first choice language. By some measures, more programming is embedded systems than business/enterprise IT or commercial application development, and it's mostly C. It may often be called C++ -- and compiled with a C++ compiler -- but minimal if any use of classes, new/delete, etc. It's essentially C.
I think you're splitting hairs. C occupied a particular position for far less than 50 years, and now would not be a first choice for many general computing applications (but there are niches it has moved into).
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
That's depressing.
If your goal is to create the next "big" language, maybe. The last significant small company commercial effort of any note in that vein was perhaps that "other" D, Digital Mars's D, and whilst you rarely see mention of it now, I understand it's doing well (relatively speaking.)
But creating useful libraries, frameworks, platforms, tools of other kinds which work with the current popular language pantheon seems like great opportunity, and that's hardly depressing.
But that isn't the point of this thread, is it?
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
If that's who we're waiting for, it's even more depressing, but I think it's wrong. The FP tank is empty, Prolog is niche and apart from unions, there isn't much more to get out of type theory. [Please enlighten me if I missed something.]
Computational type theory is no more or less than the study of mechanisms to reduce possible bugs in programs, so there's probably still a lot to get out of that.
That really is the target for my safer: not adding features and complexity but languages that let the compiler avoid whole classes of unsafe coding. Crystal and a couple of the others specifically nominate that.
There are companies using functional and logical programming to build software tools that other companies buy and use. As long as that appears set to continue and grow -- and it does; this is cutting-edge stuff, these are absolutely not legacy products -- and as long as there isn't anything in general programming that precludes effective use of functional and logical programming (in general, there isn't) then the fact these paradigms currently largely thrive only in some niches represents opportunity, even if only (at least initially) to fill other niches.
The FP languages are indeed survivors, but the ideas are well-established and do not attract followers. New languages (revolution!) that incorporate these ideas as well as others stand a better chance.
Quote from Dave Voorhis on April 19, 2021, 8:46 am
Quote from dandl on April 19, 2021, 5:08 am
Taking the first generation as machine code, 2nd as ASM, I would argue that 3GL is Fortran/Cobol/Basic, the languages with predefined types. To me, the languages of user-definable types are the true 4GLs: C and Algol just barely but Pascal, VB6, C++, Java/C# etc and the dynamic types are all on much the same level. Java is doing exactly the same jobs as VB6 before it: the main difference is the facilities for writing libraries and APIs. VBA lives on in the various Office products, and C# has a few niches (I'm writing for Unity at present), but as you say, the mental processes are similar, it's the confounded details that keep giving me hell!
This highlights one of the reasons the terms 3GL, 4GL, etc. were never used much in academic discourse, and aren't really used any more in casual discussion either. But, yes, the mental processes for all of the above -- except aspects of machine code and assembly languages -- are essentially the same. I don't find the details of the languages themselves to be particularly problematic. What I find problematic is building skyscraper-sized software systems out of low(ish) level toothpicks.
No, my point is that the mental model changed a lot after Fortran/Cobol/Algol/Basic, with the introduction of real type systems and user-defined types. I have many thousands of lines of code written pre-1985, and it really is about the toothpicks, as against the ice-cream sticks we're using now. ASM -> HLL was a big step, and HLL -> HLL with type systems was another. I would argue that the encapsulation features and generics are another such, because of how they allow these gigantic libraries to be packaged. But I've seen no such step in the past 20 years.
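That generics step can be sketched with a minimal Java example (names here are hypothetical, purely for illustration): a type-parameterised library method is checked for every element type at the call site, which is what lets large libraries be packaged and reused without the Object-and-cast style of pre-generics code.

```java
import java.util.List;

public class GenericsStep {
    // One generic method serves every Comparable element type; the
    // compiler rejects a List<Object> or a mixed list at the call site,
    // where pre-generics code would have cast and failed at runtime.
    static <T extends Comparable<T>> T max(List<T> xs) {
        T best = xs.get(0);
        for (T x : xs) {
            if (x.compareTo(best) > 0) best = x;
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(max(List.of(3, 1, 4, 1, 5)));  // 5
        System.out.println(max(List.of("ant", "bee")));   // bee
    }
}
```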
The real point was to characterise safer-higher-shorter as it has already been applied to get to where we are. From ASM to C to Java is roughly 20-year steps, and each is unarguably moving in that direction. So I'm asking: the 20 years are up, so where do you see that next step?
Functional programming. Goal-directed programming. Logic programming. Better type systems.
Why do you hold that view, with so little evidence to support it? There are languages offering each of those things, and they get precious little traction.
They get traction in specialist niches, where they demonstrate dramatic improvements in productivity over the usual imperative alternatives. If their benefits can be generalised -- or at least extended -- beyond narrow specialist niches, then they offer potential benefits that no amount of rearranging imperative programming can possibly reach.
So I tried to find some languages to test that idea. See https://builtin.com/software-engineering-perspectives/new-programming-languages and https://www.rankred.com/new-programming-languages-to-learn/. If there are other people out there who see things as I do, they will make themselves visible by creating new languages.
So I pruned the list by excluding languages
- that have a history of more than 20 years (any direct derivative of ML/OCaml, Python, Elixir?),
- those that simply try to make JS development saner (Elm, Typescript and probably Dart),
- two that focus rather strongly on the numerical niche (R, Julia)
- and I'm dubious about Go (boring, low level), Pony (not stable).
Of those that are left, I tried to find the five principles: safer-higher-shorter, meta and fits in (meaning interoperable with existing code/libraries).
- words for safer are very common, nominating the areas of type inference, memory, concurrency
- words for shorter appear often, e.g. 'concise'
- words suggesting higher do not (except in the OCaml family)
- Crystal, Elixir, Groovy, Nim, Rust have meta-programming/macros
- There is a mix of native and VM, with words like 'interoperable' and 'performance'. Only Kotlin and Groovy target the JVM, but most offer some way of interacting with native code.
So this research tells me that most (4 of 5) of the principles I've been plugging are out there driving language development. Of this list, the only ones suitable for general app development right now are probably Groovy and Nim, with Crystal as an outsider. The principles of safe and meta are well-served, but any attempt to find higher is doomed to failure.
That's perhaps because "higher" isn't a generally-used term. Indeed, general-purpose increases in abstraction level are relatively rare, and sometimes controversial as to whether they're really higher-order abstractions or low-level foundations, etc. Better, perhaps, to use terms like expressivity, compose-ability, etc. It's the same terminological focus as using "concise" instead of "shorter", because the former implies readability and understandability whereas the latter suggests overly-terse APL or Z notation.
You've missed Haskell, Eiffel, Erlang, Mercury, Kotlin, Scala...
Go is "boring"?
What does that mean?
But perhaps more relevant is to where it looks like you're going are specialist languages like SAS, SPSS, K, and -- this is a biggie -- numerous commercial in-house languages that are used to solve a category of problems, or target a specific industrial domain, and essentially don't show up in the usual lists at all. They're not general-purpose languages, they're not generally available (in some cases they're not "available" at all; they're strictly used in-house to deliver solutions), and (from what I've seen) do deliver significant productivity increases in their niches.
They're usually transpilers, emitting compile-able C, Java, Julia, and others.
I've not found anything else that didn't have a Java equivalent or some reasonably performant workaround. (I appreciate that others might have found different blockers, or none at all.)
JIT code is usually fast enough, but you are at the mercy of the GC. The benchmarks where C# comes out in front are those using non-GC value types.
Yes, JIT code is usually fast enough, but in the case of function pointers the alternatives involve various slow bodgery like needing to instantiate objects and/or invoke via virtual methods. Notably, an early JVM language called Pizza specifically addressed the issue, as it was a recognised one.
But, as noted, the lambda mechanisms may have addressed it. I'll experiment with that when I get a moment.
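For reference, a minimal sketch of the lambda mechanism in question (hypothetical names, just an illustration): a lambda implementing a functional interface plays the role a function pointer would in C, without hand-writing a wrapper object, and the JIT can often inline the call.

```java
import java.util.function.IntBinaryOperator;

public class LambdaDispatch {
    // The lambda is passed where C would pass a function pointer;
    // no explicit adapter class or virtual-method boilerplate needed.
    static int apply(IntBinaryOperator op, int a, int b) {
        return op.applyAsInt(a, b);
    }

    public static void main(String[] args) {
        IntBinaryOperator add = (x, y) -> x + y;
        IntBinaryOperator mul = (x, y) -> x * y;
        System.out.println(apply(add, 3, 4));  // 7
        System.out.println(apply(mul, 3, 4));  // 12
    }
}
```

Whether this actually matches raw function-pointer performance is an empirical question, but it removes the hand-written object instantiation that the older workarounds required.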
That's stretching it a bit. C as a first choice programming language probably had a life of less than 20 years, from K&R to Stroustrup. The replacement of C by C++ is the kind of thing I expect should happen to Java. And please note: C++ wins because of safer-higher-shorter. This is exactly why C++ strings and collections win out over char* and pointer arithmetic, despite the added complexity.
C is still significantly a first choice language. By some measures, more programming is embedded systems than business/enterprise IT or commercial application development, and it's mostly C. It may often be called C++ -- and compiled with a C++ compiler -- but minimal if any use of classes, new/delete, etc. It's essentially C.
I think you're splitting hairs. C occupied a particular position for far less than 50 years, and now would not be a first choice for many general computing applications (but there are niches it has moved into).
As I noted, in name it might not be the first choice -- it would be inaccurately labelled "C++" -- but much embedded-systems code is pure C.
And note C's position on the TIOBE Index. (But then note too that assembly language, classic VB, Delphi, and Fortran are rising stars -- the TIOBE Index is weird.)
I also see new languages emerging based on functional programming, goal-directed programming, logic programming, and better type systems.
I don't see much chance -- or point -- for another new post-C++ mainly-imperative 3GL other than (perhaps) Rust. Existing attempts aren't that much stronger than C++ or Java or C#, so most developers (and their managers) won't even consider switching. Newcomers in the same vein would therefore need to add potent new capability, which is why I suggest looking to the functional/goal-directed/logic/advanced-type-systems world.
Thus, expect the future to be steady evolution of the popular languages we have now (C/C++/C#/Java/Python/JavaScript + Kotlin/Rust/Julia/Go/Swift/TypeScript + SQL) plus associated libraries/frameworks/platforms...
That's depressing.
If your goal is to create the next "big" language, maybe. The last significant small company commercial effort of any note in that vein was perhaps that "other" D, Digital Mars's D, and whilst you rarely see mention of it now, I understand it's doing well (relatively speaking.)
But creating useful libraries, frameworks, platforms, tools of other kinds which work with the current popular language pantheon seems like great opportunity, and that's hardly depressing.
But that isn't the point of this thread, is it?
There's a point?
Ah.
...At least, until dramatically more effective ways to develop software emerge from the functional/goal-directed/logic/advanced-type-systems world.
But of course I might be completely wrong. If there's anything we can say for certain about the IT industry, it's that in the long term it's almost completely unpredictable.
If that's who we're waiting for, it's even more depressing, but I think it's wrong. The FP tank is empty, Prolog is niche and apart from unions, there isn't much more to get out of type theory. [Please enlighten me if I missed something.]
Computational type theory is no more or less than the study of mechanisms to reduce possible bugs in programs, so there's probably still a lot to get out of that.
That really is the target for my safer: not adding features and complexity but languages that let the compiler avoid whole classes of unsafe coding. Crystal and a couple of the others specifically nominate that.
Virtually any functional or logical programming language implies it, too, along with the inevitable reduction in complexity and increase in predictability that comes from minimising or isolating mutable state.
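One concrete instance of letting the compiler (or at least the type system) rule out a whole class of unsafe coding, sketched in Java with hypothetical names: returning an Optional makes the "not found" case part of the signature, so absence has to be handled at the call site rather than surfacing later as a NullPointerException.

```java
import java.util.Map;
import java.util.Optional;

public class SafeLookup {
    static final Map<String, Integer> PORTS = Map.of("http", 80, "https", 443);

    // The possibility of absence is visible in the return type;
    // callers cannot silently dereference a null.
    static Optional<Integer> portFor(String scheme) {
        return Optional.ofNullable(PORTS.get(scheme));
    }

    public static void main(String[] args) {
        System.out.println(portFor("http").orElse(-1));  // 80
        System.out.println(portFor("ftp").orElse(-1));   // -1
    }
}
```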
There are companies using functional and logical programming to build software tools that other companies buy and use. As long as that appears set to continue and grow -- and it does; this is cutting-edge stuff, these are absolutely not legacy products -- and as long as there isn't anything in general programming that precludes effective use of functional and logical programming (in general, there isn't) then the fact these paradigms currently largely thrive only in some niches represents opportunity, even if only (at least initially) to fill other niches.
The FP languages are indeed survivors, but the ideas are well-established and do not attract followers. New languages (revolution!) that incorporate these ideas as well as others stand a better chance.
Indeed, new languages that incorporate ideas from functional and logical programming stand a better chance, as I was suggesting before. Merely offering safety plus unspecified notions of "shorter" and "higher" are essentially what Rust and Digital Mars's D provide now. The general movement in C#/.NET and Java/JVM is toward safer, definitely -- note the option types in C#, and Kotlin for the JVM -- but also facilities drawn from functional programming like pattern matching and, previously, LINQ and Streams.