'data language' / 'programming language'
Quote from Dave Voorhis on February 5, 2021, 11:50 pm
Quote from dandl on February 5, 2021, 11:16 pm
Quote from Dave Voorhis on February 5, 2021, 1:45 pm
Quote from dandl on February 5, 2021, 1:35 pm
Quote from Dave Voorhis on February 5, 2021, 12:49 pm
Quote from dandl on February 5, 2021, 10:34 am
Quote from Dave Voorhis on February 5, 2021, 9:26 am
Quote from dandl on February 5, 2021, 8:18 am
... In the model I showed there is a Data/View separation for exactly that purpose. This kind of requirement has a regularity to it that can be captured. But I do not propose to include business rules: they are algorithmic in nature and best expressed in code. I do propose 3 sub-languages:
- Regex for validation
- Format strings for display
- Expression evaluation, both for validation and for calculated fields.
Looks more and more like MS Access, it does.
I imagine the Access folks must have reached a point in their "Expression evaluation, both for validation and for calculated fields" implementation where they said, "Screw it, this is turning into a full-fledged language. Let's just plug in Visual Basic."
Not even close. A programming language is Turing Complete: it has state and control flow, which are here specifically excluded. These are a close match to the features in SQL, but for use in the client data model.
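For concreteness, the three proposed sub-languages map fairly directly onto facilities that already exist in the .NET base class library, with no state or control flow in the sub-languages themselves. A minimal sketch (the Order table and field names are invented for illustration):

```csharp
using System;
using System.Data;
using System.Text.RegularExpressions;

class SubLanguageSketch
{
    static void Main()
    {
        // 1. Regex for validation (hypothetical customer-code rule).
        bool valid = Regex.IsMatch("CUST-00042", @"^CUST-\d{5}$");

        // 2. Format strings for display.
        string display = string.Format("{0:C2} due {1:yyyy-MM-dd}", 123.5m, DateTime.Today);

        // 3. Expression evaluation for calculated fields: SQL-like, but in the client data model.
        var order = new DataTable("Order");
        order.Columns.Add("UnitPrice", typeof(decimal));
        order.Columns.Add("Qty", typeof(int));
        order.Columns.Add("Total", typeof(decimal), "UnitPrice * Qty");  // calculated column

        var row = order.NewRow();
        row["UnitPrice"] = 19.95m;
        row["Qty"] = 3;
        order.Rows.Add(row);

        Console.WriteLine($"{valid} | {display} | Total = {row["Total"]}");
    }
}
```

The calculated-column expression is essentially a SQL-flavoured expression syntax evaluated client-side, which is the "close match to the features in SQL, but for use in the client data model" point above.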
Yes, I imagine that's what the Access 1.0 developers said too, shortly before they decided to integrate Visual Basic. ;-)
Not a chance. This is 1992 and Bill Loves Basic. The dialect changed over the years, but it was always Basic.
Of course. But my point is not about the choice of which programming language they used, but the fact that they obviously decided they needed one.
And in the context of 1992 and earlier this was the right decision. It drove dBase, FoxPro, Powerflex and many others. There was no Java or C# or indeed any other usable language for business applications. VB was state of the art and every Office product followed down that path, whether users wanted it or not.
Yes, again, my point isn't about which language was chosen. Whether it was VB or something else is irrelevant.
My point is that a Turing complete language was chosen. Not merely an expression evaluator, but a full-blown, Turing-Complete language.
But that was then and this is now, 30 years later. Those are tired old arguments that no longer apply. Those products always targeted programmers, or people who were prepared to learn how to write programs. There is a big market now for nocode products and they make money. If you don't want to engage with that, it's your choice. Stay with what you know. I see new horizons.
I'm not convinced No/Low Code products make money. I've seen lots of attempts over the past 40 years, since I was interested enough to notice. I remember seeing ads for No/Low Code products (though they weren't called that) in Byte, Computerworld and Dr. Dobbs Journal. They've always bounced along in the application development background, seemingly no more or less important now than ever.
But that's not my point. My point is simply that you will inevitably want Turing-complete capability.
That doesn't mean you have to use it, of course, but you will want to provide it -- perhaps even as an option -- for when it's needed.
That's because no matter how powerful your expression evaluator might be, unless it's a full Turing-complete expression evaluator with library support, some of your customers will inevitably want to do something that a simple expression evaluator can't do -- or is very awkward to do -- and you or the user will want to write code. It might be something conceptually simple -- like generating customer IDs according to some nutty algorithm (I've seen that), or sending an email to a manager whenever a row is inserted in a certain table (I've seen that), or retrieving an authorisation token from some corporate system to plug into a table (seen it), and then your otherwise all-CRUD system will need a wee bit of code, because that wee bit of code does something really important.
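The kind of escape hatch being described might look something like the sketch below. Every type in it is hypothetical (no real product is being quoted); the point is only that the "wee bit of code" plugs into an otherwise declarative application through a narrow hook.

```csharp
// Hypothetical extensibility hook for a no-code CRUD tool -- these types don't
// belong to any real product; they just illustrate the escape hatch.
using System.Collections.Generic;
using System.Net.Mail;

public interface IRowHook
{
    // Called by the (imagined) CRUD runtime after a row is inserted.
    void OnRowInserted(string tableName, IDictionary<string, object> row);
}

public class NotifyManagerHook : IRowHook
{
    public void OnRowInserted(string tableName, IDictionary<string, object> row)
    {
        if (tableName != "ExpenseClaims") return;
        // The nutty-but-important business logic lives here, not in the expression language.
        new SmtpClient("smtp.example.com").Send(          // server and addresses are placeholders
            "crud-app@example.com", "manager@example.com",
            $"New expense claim: {row["ClaimId"]}", "A new row was inserted.");
    }
}
```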
That's why MS Office products support VBA.
Obviously, the user who built the application might have to hire in coding ability to implement the coded functionality -- which if the application is important enough, they'll almost certainly be fine with -- and if your tool allows Turing-complete capability with a real language, they'll happily code it as needed and keep using your tool, glad that it offers the extensibility they need.
If your tool doesn't allow coding, the hired consultant -- or a savvy user -- will re-write the application in Java or C# or Python or something else, they won't pick your tool for the next project, and you'll lose a customer.
Quote from dandl on February 6, 2021, 8:12 am
Yes, again, my point isn't about which language was chosen. Whether it was VB or something else is irrelevant.
My point is that a Turing complete language was chosen. Not merely an expression evaluator, but a full-blown, Turing-Complete language.
But that was then and this is now, 30 years later. Those are tired old arguments that no longer apply. Those products always targeted programmers, or people who were prepared to learn how to write programs. There is a big market now for nocode products and they make money. If you don't want to engage with that, it's your choice. Stay with what you know. I see new horizons.
I'm not convinced No/Low Code products make money. I've seen lots of attempts over the past 40 years, since I was interested enough to notice. I remember seeing ads for No/Low Code products (though they weren't called that) in Byte, Computerworld and Dr. Dobbs Journal. They've always bounced along in the application development background, seemingly no more or less important now than ever.
But that's not my point. My point is simply that you will inevitably want Turing-complete capability.
That doesn't mean you have to use it, of course, but you will want to provide it -- perhaps even as an option -- for when it's needed.
That isn't the point you've been pushing. Everything you've said is about code first, code everywhere, code-centric.
I started with Proposition 1: that for a large set of application requirements (window-on-data/filing cabinet, no workflow or rules):
- there is an application data model that can satisfy a high proportion (say 80%) without writing any code
- the other 20% require code snippets
- there is an unsolved problem about integrating the two.
If you now accept that Proposition 1 (which your last sentence implies), please say so and we move on. If your position is that you'll need code eventually so you should start with code, we have nothing to discuss.
And by the way I reject totally anything based on the technology of 30 or 40 years ago, when there were no decent languages, no SQL and no GUIs. Please stick to current events or recent history, say 10 years or so.
That's because no matter how powerful your expression evaluator might be, unless it's a full Turing-complete expression evaluator with library support, some of your customers will inevitably want to do something that a simple expression evaluator can't do -- or is very awkward to do -- and you or the user will want to write code. It might be something conceptually simple -- like generating customer IDs according to some nutty algorithm (I've seen that), or sending an email to a manager whenever a row is inserted in a certain table (I've seen that), or retrieving an authorisation token from some corporate system to plug into a table (seen it), and then your otherwise all-CRUD system will need a wee bit of code, because that wee bit of code does something really important.
That's why MS Office products support VBA.
See Proposition 1. There is a large range of data management and presentation problems that can be solved by constructing Excel spreadsheets with no code (a). Then once the requirements go beyond that, we go to code (b). The integration between the spreadsheet formulae and the VBA code is problematic (c). Same theme exactly.
Obviously, the user who built the application might have to hire in coding ability to implement the coded functionality -- which if the application is important enough, they'll almost certainly be fine with -- and if your tool allows Turing-complete capability with a real language, they'll happily code it as needed and keep using your tool, glad that it offers the extensibility they need.
If your tool doesn't allow coding, the hired consultant -- or a savvy user -- will re-write the application in Java or C# or Python or something else, they won't pick your tool for the next project, and you'll lose a customer.
So rewrite your Excel spreadsheet if you have to, but you will still pick Excel as the starting point for your next problem.
Quote from Dave Voorhis on February 6, 2021, 10:41 am
Quote from dandl on February 6, 2021, 8:12 am
Yes, again, my point isn't about which language was chosen. Whether it was VB or something else is irrelevant.
My point is that a Turing complete language was chosen. Not merely an expression evaluator, but a full-blown, Turing-Complete language.
But that was then and this is now, 30 years later. Those are tired old arguments that no longer apply. Those products always targeted programmers, or people who were prepared to learn how to write programs. There is a big market now for nocode products and they make money. If you don't want to engage with that, it's your choice. Stay with what you know. I see new horizons.
I'm not convinced No/Low Code products make money. I've seen lots of attempts over the past 40 years, since I was interested enough to notice. I remember seeing ads for No/Low Code products (though they weren't called that) in Byte, Computerworld and Dr. Dobbs Journal. They've always bounced along in the application development background, seemingly no more or less important now than ever.
But that's not my point. My point is simply that you will inevitably want Turing-complete capability.
That doesn't mean you have to use it, of course, but you will want to provide it -- perhaps even as an option -- for when it's needed.
That isn't the point you've been pushing. Everything you've said is about code first, code everywhere, code-centric.
There are essentially two distinct topics under discussion here, and that may have led to some misunderstandings.
Topic #1 is about languages. My point about languages is to stop creating new ones (unless absolutely necessary, or truly empowering) in favour of leveraging or "amplifying" the ones we've got.
E.g., don't replace SQL with an ORM-QL that emits SQL, give me SQL and make it easier to use.
E.g., don't give me a new language in addition to Java (or C#, Python, whatever) because Java isn't data/configuration/UI/whatever enough, give me a data/configuration/UI/whatever API and let me use Java.
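A rough sketch of that "amplify SQL rather than replace it" idea: the SQL stays visible as SQL, and a small helper (QueryHelper is invented here; the ADO.NET calls underneath are standard) only handles connections, parameters and row mapping.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public static class QueryHelper
{
    public static IEnumerable<T> Query<T>(string connectionString, string sql,
        IDictionary<string, object> args, Func<SqlDataReader, T> map)
    {
        using var conn = new SqlConnection(connectionString);
        using var cmd = new SqlCommand(sql, conn);
        foreach (var a in args)
            cmd.Parameters.AddWithValue("@" + a.Key, a.Value);
        conn.Open();
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
            yield return map(reader);
    }
}

// Usage: the query is plain SQL, not an ORM-QL that emits SQL.
// var names = QueryHelper.Query(connectionString,
//     "SELECT Name FROM Customer WHERE Region = @Region",
//     new Dictionary<string, object> { ["Region"] = "EMEA" },
//     r => r.GetString(0));
```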
Topic #2 is about automating CRUD and (perhaps to a lesser degree) "no code" or "low code" application development, which I argue has floated along as a specialist software category for decades, and from what I've seen actually hasn't changed much in functionality over that time other than move from desktops to mobile and Web. But that's largely beside the point.
My point about Topic #2 is that 100% no-code isn't sustainable. It will always need at least the ability to allow users to code, even if it isn't "code first."
I started with Proposition 1: that for a large set of application requirements (window-on-data/filing cabinet, no workflow or rules):
- there is an application data model that can satisfy a high proportion (say 80%) without writing any code
- the other 20% require code snippets
- there is an unsolved problem about integrating the two.
If you now accept that Proposition 1 (which your last sentence implies), please say so and we move on. If your position is that you'll need code eventually so you should start with code, we have nothing to discuss.
The numeric proportions are right, but aren't applied the right way.
I argue that given a typical application A:
- 80% of A can be generated without code.
- 20% of A will need code snippets.
There only appears to be an unsolved problem integrating the two, which I suspect is caused by assuming that 80% of applications don't need code snippets at all (and so code is considered an afterthought) when in fact it should be that nearly 100% of applications need some code snippets and so code needs to be considered from the outset.
I also argue that MS Access did and still does a surprisingly good job in exactly that respect.
In fact, I'm surprised more developers aren't aware of it. There's a common (mistaken) assumption that Access is only an obsolete "VB-lite" VBA IDE, with some data-oriented widgets and a crappy almost-single-user DBMS thrown in the box, and you're expected to build applications by writing epic quantities of (gag) VB from scratch.
Indeed, it can be used that way. Used that way, it's awful, but... Well, it's awful.
How it works really well is by using it as intended, which is to forgo the local DBMS except for genuinely local desktop data, and:
- Create a new database document and use "link tables" to obtain live ODBC access to your various enterprise DBMSs, of which -- if you're like most organisations -- there are many, but maybe a dozen or so tables across them that you use all the time or are needed for a specific purpose. Those are the tables you link.
- Use the "Relationships" view to identify the foreign key constraints between your linked tables. (With MS SQL Server, I think it might do that for you automagically. Not sure.)
- Edit data in tables. If the default grids (which are really good -- they even implement foreign-key lookups if you did Step #2 above) aren't good enough:
- Use the Access wizards to auto-generate forms.
- Customise the resulting forms however you like.
- Create queries and reports as needed to extract and present data how you like.
That will cover 80% of the requirements for a typical data-oriented application -- or a typical corporate user's database editing requirements -- without any code. For the 20% of the typical application that needs code, you fire up VBA (Access integrates it nicely) and write it.
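Outside Access, the "link a handful of enterprise tables over ODBC" step boils down to something like the following; the DSN and table names are made up, but System.Data.Odbc is standard ADO.NET.

```csharp
using System;
using System.Data.Odbc;

class LinkedTableSketch
{
    static void Main()
    {
        // "CorporateWarehouse" is an assumed ODBC DSN pointing at an enterprise DBMS.
        using var conn = new OdbcConnection("DSN=CorporateWarehouse;");
        conn.Open();
        using var cmd = new OdbcCommand("SELECT CustomerId, Name FROM Customer", conn);
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
            Console.WriteLine($"{reader["CustomerId"]}: {reader["Name"]}");
    }
}
```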
I've seen a few organisations embrace this approach and do it well, and for them it works really, really well. It definitely cuts down on the proliferation of ad-hoc Excel spreadsheets.
Though as I've mentioned, even at best this approach is going away, as organisations launch "digital transformation" initiatives to get rid of paper (again) and reduce manual data entry and replace both with better systems integration -- more "my system will talk to your system" and less "I'll type the contents of your printed report into my data-entry screen" -- and on-screen visualisation dashboards.
And by the way I reject totally anything based on the technology of 30 or 40 years ago, when there were no decent languages, no SQL and no GUIs. Please stick to current events or recent history, say 10 years or so.
That's because no matter how powerful your expression evaluator might be, unless it's a full Turing-complete expression evaluator with library support, some of your customers will inevitably want to do something that a simple expression evaluator can't do -- or is very awkward to do -- and you or the user will want to write code. It might be something conceptually simple -- like generating customer IDs according to some nutty algorithm (I've seen that), or sending an email to a manager whenever a row is inserted in a certain table (I've seen that), or retrieving an authorisation token from some corporate system to plug into a table (seen it), and then your otherwise all-CRUD system will need a wee bit of code, because that wee bit of code does something really important.
That's why MS Office products support VBA.
See Proposition 1. There is a large range of data management and presentation problems that can be solved by constructing Excel spreadsheets with no code (a). Then once the requirements go beyond that, we go to code (b). The integration between the spreadsheet formulae and the VBA code is problematic (c). Same theme exactly.
Again, I think the proportion is not that 80% of (CRUD-based) applications can be written entirely no code and 20% of applications need some code. It's that nearly 100% of applications need 80% no-code + 20% code.
Of course, that's applications. If the problem can be solved with an Excel spreadsheet it typically will be -- except for the enlightened organisations that have embraced Access per what I wrote above, but even that only reduces Excel. It doesn't eliminate it.
As for integrating Excel and VBA, it's easy, assuming you need custom functions. See https://www.dummies.com/software/microsoft-office/excel/how-to-create-custom-excel-functions/
Other Excel automation is in the same vein.
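For a C# rather than VBA flavour of the same thing, a custom worksheet function can also be exposed through the third-party Excel-DNA add-in; a minimal sketch, assuming Excel-DNA's usual [ExcelFunction] pattern (the function itself is invented):

```csharp
// Assumes the Excel-DNA add-in (ExcelDna.Integration) is referenced and packaged as an .xll.
using ExcelDna.Integration;

public static class CustomFunctions
{
    [ExcelFunction(Description = "Net price after a percentage discount")]
    public static double NetPrice(double gross, double discountPercent)
    {
        return gross * (1.0 - discountPercent / 100.0);
    }
}

// In a cell: =NetPrice(A1, B1)
```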
Obviously, the user who built the application might have to hire in coding ability to implement the coded functionality -- which if the application is important enough, they'll almost certainly be fine with -- and if your tool allows Turing-complete capability with a real language, they'll happily code it as needed and keep using your tool, glad that it offers the extensibility they need.
If your tool doesn't allow coding, the hired consultant -- or a savvy user -- will re-write the application in Java or C# or Python or something else, they won't pick your tool for the next project, and you'll lose a customer.
So rewrite your Excel spreadsheet if you have to, but you will still pick Excel as the starting point for your next problem.
Excel, yes.
A non-Excel no-code solution that started out easy and turned out to cost a lot to be rewritten, probably not.
Quote from Erwin on February 6, 2021, 9:44 pm
Quote from dandl on February 5, 2021, 1:26 pm
but the products have been getting steadily better and now they're getting serious traction and making serious money.
Products "getting steadily better" and "getting serious traction" (in a world full of ever donkier donks) and "making serious money" are of course the ultimate argument to conclude any discussion supposed to be thoroughly grounded in science.
Quote from dandl on February 6, 2021, 11:55 pm
Quote from Erwin on February 6, 2021, 9:44 pm
Quote from dandl on February 5, 2021, 1:26 pm
but the products have been getting steadily better and now they're getting serious traction and making serious money.
Products "getting steadily better" and "getting serious traction" (in a world full of ever donkier donks) and "making serious money" are of course the ultimate argument to conclude any discussion supposed to be thoroughly grounded in science.
Those are predominantly comments about human behaviour, which is as you know stubbornly resistant to science. They say there are people who want it, need it and will pay for it (unlike, it would seem, either TTM or the RM). Which may or may not make it worth pursuing.
Quote from dandl on February 7, 2021, 12:48 am
There are essentially two distinct topics under discussion here, and that may have led to some misunderstandings.
Topic #1 is about languages. My point about languages is to stop creating new ones (unless absolutely necessary, or truly empowering) in favour of leveraging or "amplifying" the ones we've got.
E.g., don't replace SQL with an ORM-QL that emits SQL, give me SQL and make it easier to use.
E.g., don't give me a new language in addition to Java (or C#, Python, whatever) because Java isn't data/configuration/UI/whatever enough, give me a data/configuration/UI/whatever API and let me use Java.
Agreed. Up until about 1995 there was a major shortage of good-enough languages, which is why Basic got as far as it did. If we accept that:
- the major GP languages are good enough for Turing Complete
- SQL, HTML, CSS, regex, XML, JSON, Yaml etc are each good enough for their respective purposes
we can stop looking for new ones.
Topic #2 is about automating CRUD and (perhaps to a lesser degree) "no code" or "low code" application development, which I argue has floated along as a specialist software category for decades, and from what I've seen actually hasn't changed much in functionality over that time other than move from desktops to mobile and Web. But that's largely beside the point.
My point about Topic #2 is that 100% no-code isn't sustainable. It will always need at least the ability to allow users to code, even if it isn't "code first."
I started with Proposition 1: that for a large set of application requirements (window-on-data/filing cabinet, no workflow or rules):
- there is an application data model that can satisfy a high proportion (say 80%) without writing any code
- the other 20% require code snippets
- there is an unsolved problem about integrating the two.
If you now accept that Proposition 1 (which your last sentence implies), please say so and we move on. If your position is that you'll need code eventually so you should start with code, we have nothing to discuss.
The numeric proportions are right, but aren't applied the right way.
I argue that given a typical application A:
- 80% of A can be generated without code.
- 20% of A will need code snippets.
There only appears to be an unsolved problem integrating the two, which I suspect is caused by assuming that 80% of applications don't need code snippets at all (and so code is considered an afterthought) when in fact it should be that nearly 100% of applications need some code snippets and so code needs to be considered from the outset.
I also argue that MS Access did and still does a surprisingly good job in exactly that respect.
In fact, I'm surprised more developers aren't aware of it. There's a common (mistaken) assumption that Access is only an obsolete "VB-lite" VBA IDE, with some data-oriented widgets and a crappy almost-single-user DBMS thrown in the box, and you're expected to build applications by writing epic quantities of (gag) VB from scratch.
Indeed, it can be used that way. Used that way, it's awful, but... Well, it's awful.
Agreed. I thought this was what you were proposing.
How it works really well is by using it as intended, which is to forgo the local DBMS except for genuinely local desktop data, and:
- Create a new database document and use "link tables" to obtain live ODBC access to your various enterprise DBMSs, of which -- if you're like most organisations -- there are many, but maybe a dozen or so tables across them that you use all the time or are needed for a specific purpose. Those are the tables you link.
- Use the "Relationships" view to identify the foreign key constraints between your linked tables. (With MS SQL Server, I think it might do that for you automagically. Not sure.)
- Edit data in tables. If the default grids (which are really good -- they even implement foreign-key lookups if you did Step #2 above) aren't good enough:
- Use the Access wizards to auto-generate forms.
- Customise the resulting forms however you like.
- Create queries and reports as needed to extract and present data how you like.
That will cover 80% of the requirements for a typical data-oriented application -- or a typical corporate user's database editing requirements -- without any code. For the 20% of the typical application that needs code, you fire up VBA (Access integrates it nicely) and write it.
I've seen a few organisations embrace this approach and do it well, and for them it works really, really well. It definitely cuts down on the proliferation of ad-hoc Excel spreadsheets.
Yes. This is the Access version of exactly the process I have in mind. So, at a more abstract level, the process is:
- Create a series of documents representing the database, relationships, grids, forms, reports and queries that constitute 80% of the application
- Write code snippets in some GP language for the other 20%.
- Integrate the two (Access uses COM but there are better choices).
This is Proposition 1. As an aside, Unity game dev works the same way. There is a visual editor that constructs a scene model and related assets as data, and C# snippets (scripts) for the game logic. The integration tech is a C# API and callbacks.
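For reference, the Unity split referred to here looks like this in practice: the scene and per-component settings are data authored in the visual editor, and the engine integrates the C# snippet purely through callbacks (standard UnityEngine API; the behaviour itself is made up).

```csharp
using UnityEngine;

public class SpinOnUpdate : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 90f;  // edited as data in the Inspector

    void Update()
    {
        // Callback invoked by the engine each frame: the "snippet" of game logic.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```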
Again, I think the proportion is not that 80% of (CRUD-based) applications can be written entirely no code and 20% of applications need some code. It's that nearly 100% of applications need 80% no-code + 20% code.
I didn't intend that. My wild guess is that maybe 10-20% of low end applications can survive without code, and struggle or die because there is no good way to add code snippets. The key point is they need a little code (snippets), not mostly code. IMO the failure of many of the nocode products is that they fail to integrate a full GP language, but try to sneak by with pseudo-languages or complexification of the nocode parts.
Of course, that's applications. If the problem can be solved with an Excel spreadsheet it typically will be -- except for the enlightened organisations that have embraced Access per what I wrote above, but even that only reduces Excel. It doesn't eliminate it.
As for integrating Excel and VBA, it's easy, assuming you need custom functions. See https://www.dummies.com/software/microsoft-office/excel/how-to-create-custom-excel-functions/
Other Excel automation is in the same vein.
That's using snippets, but Excel fails rather badly in that it has no application structure. You have to write automation code, and that really sucks. Doesn't stop people starting out with spreadsheets, though.
Anyhoo...a plausible path as I see it would be (same 3 steps):
- Replicate the data content equivalent of the Access database, linkage, grid, forms, queries and reports as stand-alone documents, each with their own tightly specified published data structure. You can create those documents using your preferred techniques of data sub-language (XML/JSON/Yaml/etc), serialisation code or visual editor (as per Access and Unity). Or perhaps even reverse-engineering Access.
- Write your snippets in your preferred GP language (Java, C#, JS, Python, whatever). Transpile as needed (Unity does that really well).
- Integrate using a modern API technology and/or callbacks (as per Unity).
I might call it TINA: This Is Not Access.
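A purely illustrative sketch of what those three steps might reduce to; every type here is invented, and the point is only the shape: documents as data, snippets in a stock GP language, and a narrow callback API between them.

```csharp
// Hypothetical "TINA" shape: none of these types exist in any real product.
using System;
using System.IO;
using System.Text.Json;

// Step 1: a document with a tightly specified, published structure.
public record FormDocument(string Name, string Table, string[] Fields);

// Step 3: the integration API -- load documents, expose callbacks.
public static class TinaRuntime
{
    public static event Action<FormDocument, string> RowInserted;

    public static void Load(string path)
    {
        var form = JsonSerializer.Deserialize<FormDocument>(File.ReadAllText(path));
        // ... generate the 80%: grid, form, queries, reports ...
        RowInserted?.Invoke(form, "new-row-id");   // hand control to the 20%
    }
}

// Step 2: the user's snippet, written in an ordinary GP language.
// TinaRuntime.RowInserted += (form, id) =>
//     Console.WriteLine($"{form.Table}: row {id} inserted via form {form.Name}");
```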
Quote from Dave Voorhis on February 7, 2021, 1:02 am
Quote from dandl on February 7, 2021, 12:48 am
There are essentially two distinct topics under discussion here, and that may have led to some misunderstandings.
Topic #1 is about languages. My point about languages is to stop creating new ones (unless absolutely necessary, or truly empowering) in favour of leveraging or "amplifying" the ones we've got.
E.g., don't replace SQL with an ORM-QL that emits SQL, give me SQL and make it easier to use.
E.g., don't give me a new language in addition to Java (or C#, Python, whatever) because Java isn't data/configuration/UI/whatever enough, give me a data/configuration/UI/whatever API and let me use Java.
Agreed. Up until about 1995 there was a major shortage of good-enough languages, which is why Basic got as far as it did. If we accept that:
- the major GP languages are good enough for Turing Complete
- SQL, HTML, CSS, regex, XML, JSON, Yaml etc are each good enough for their respective purposes
we can stop looking for new ones.
Topic #2 is about automating CRUD and (perhaps to a lesser degree) "no code" or "low code" application development, which I argue has floated along as a specialist software category for decades, and from what I've seen actually hasn't changed much in functionality over that time other than move from desktops to mobile and Web. But that's largely beside the point.
My point about Topic #2 is that 100% no-code isn't sustainable. It will always need at least the ability to allow users to code, even if it isn't "code first."
I started with Proposition 1: that for a large set of application requirements (window-on-data/filing cabinet, no workflow or rules):
- there is an application data model that can satisfy a high proportion (say 80%) without writing any code
- the other 20% require code snippets
- there is an unsolved problem about integrating the two.
If you now accept that Proposition 1 (which your last sentence implies), please say so and we move on. If your position is that you'll need code eventually so you should start with code, we have nothing to discuss.
The numeric proportions are right, but aren't applied the right way.
I argue that given a typical application A:
- 80% of A can be generated without code.
- 20% of A will need code snippets.
There only appears to be an unsolved problem integrating the two, which I suspect is caused by assuming that 80% of applications don't need code snippets at all (and so code is considered an afterthought) when in fact it should be that nearly 100% of applications need some code snippets and so code needs to be considered from the outset.
I also argue that MS Access did and still does a surprisingly good job in exactly that respect.
In fact, I'm surprised more developers aren't aware of it. There's a common (mistaken) assumption that Access is only an obsolete "VB-lite" VBA IDE, with some data-oriented widgets and a crappy almost-single-user DBMS thrown in the box, and you're expected to build applications by writing epic quantities of (gag) VB from scratch.
Indeed, it can be used that way. Used that way, it's awful, but... Well, it's awful.
Agreed. I thought this was what you were proposing.
How it works really well is by using it as intended, which is to forgo the local DBMS except for genuinely local desktop data, and:
- Create a new database document and use "link tables" to obtain live ODBC access to your various enterprise DBMSs, of which -- if you're like most organisations -- there are many, but maybe a dozen or so tables across them that you use all the time or are needed for a specific purpose. Those are the tables you link.
- Use the "Relationships" view to identify the foreign key constraints between your linked tables. (With MS SQL Server, I think it might do that for you automagically. Not sure.)
- Edit data in tables. If the default grids (which are really good -- they even implement foreign-key lookups if you did Step #2 above) aren't good enough:
- Use the Access wizards to auto-generate forms.
- Customise the resulting forms however you like.
- Create queries and reports as needed to extract and present data how you like.
That will cover 80% of the requirements for a typical data-oriented application -- or a typical corporate user's database editing requirements -- without any code. For the 20% of the typical application that needs code, you fire up VBA (Access integrates it nicely) and write it.
I've seen a few organisations embrace this approach and do it well, and for them it works really, really well. It definitely cuts down on the proliferation of ad-hoc Excel spreadsheets.
Yes. This is the Access version of exactly the process I have in mind. So, at a more abstract level, the process is:
- Create a series of documents representing the database, relationships, grids, forms, reports and queries that constitute 80% of the application
- Write code snippets in some GP language for the other 20%.
- Integrate the two (Access uses COM but there are better choices).
Does it use COM?
I've never noticed. From an Access point of view, that's completely in the background and invisible to the user/developer.
This is Proposition 1. As an aside, Unity game dev works the same way. There is a visual editor that constructs a scene model and related assets as data, and C# snippets (scripts) for the game logic. The integration tech is a C# API and callbacks.
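For comparison, the Unity side of that contract is tiny: the scene and the serialized fields are authored as data in the editor, and the engine drives the snippet through well-known callbacks. This uses the real UnityEngine API but is, of course, only runnable inside a Unity project:

```csharp
using UnityEngine;

// A minimal Unity "snippet": the scene and the serialized field below are data,
// edited in the visual editor; the engine calls Update() once per frame.
public class Spinner : MonoBehaviour
{
    [SerializeField] private float degreesPerSecond = 90f;   // tweaked as data, not code

    void Update()
    {
        // The callback half of the integration: invoked by the engine each frame.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```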
Again, I think the proportion is not that 80% of (CRUD-based) applications can be written entirely no code and 20% of applications need some code. It's that nearly 100% of applications need 80% no-code + 20% code.
I didn't intend that. My wild guess is that maybe 10-20% of low-end applications can survive without code, and the rest struggle or die because there is no good way to add code snippets. The key point is they need a little code (snippets), not mostly code. IMO the failure of many of the nocode products is that they fail to integrate a full GP language, but try to sneak by with pseudo-languages or complexification of the nocode parts.
I agree.
Quote from dandl on February 7, 2021, 12:48 am
There are essentially two distinct topics under discussion here, and that may have led to some misunderstandings.
Topic #1 is about languages. My point about languages is to stop creating new ones (unless absolutely necessary, or truly empowering) in favour of leveraging or "amplifying" the ones we've got.
E.g., don't replace SQL with an ORM-QL that emits SQL, give me SQL and make it easier to use.
E.g., don't give me a new language in addition to Java (or C#, Python, whatever) because Java isn't data/configuration/UI/whatever enough, give me a data/configuration/UI/whatever API and let me use Java.
Agreed. Up until about 1995 there was a major shortage of good-enough languages, which is why Basic got as far as it did. If we accept that:
- the major GP languages are good enough for Turing Complete
- SQL, HTML, CSS, regex, XML, JSON, Yaml etc are each good enough for their respective purposes
we can stop looking for new ones.
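In that spirit, "give me SQL and make it easier to use" can be as modest as parameterised SQL behind a thin helper, rather than a new query language. A sketch using classic ADO.NET; the connection string, table and column names are placeholders:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;   // classic ADO.NET provider; Microsoft.Data.SqlClient is the newer package

static class CustomerQueries
{
    // Plain parameterised SQL behind a one-method helper: no ORM-QL, no new language.
    public static List<string> NamesInCity(string connectionString, string city)
    {
        var names = new List<string>();
        using var conn = new SqlConnection(connectionString);
        using var cmd  = new SqlCommand(
            "SELECT Name FROM Customer WHERE City = @city ORDER BY Name", conn);
        cmd.Parameters.AddWithValue("@city", city);

        conn.Open();
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
            names.Add(reader.GetString(0));
        return names;
    }
}
```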
Quote from AntC on February 7, 2021, 2:34 am
But that was then and this is now, 30 years later. Those are tired old arguments that no longer apply. Those products always targeted programmers, or people who were prepared to learn how to write programs. There is a big market now for nocode products and they make money. If you don't want to engage with that, it's your choice. Stay with what you know. I see new horizons.
As another data point ... Most of the large-scale ERP packages (including the middle-market ones Microsoft has bought up) come with Low Code point-and-click languages. (I'm not up to speed with whether Microsoft are trying to convert theirs to VBA, but that would be a huge effort for little commercial benefit.)
These are definitely not Turing-complete. (SAP/Oracle don't want you trying to prove the Collatz Conjecture inside their order entry, thank you ;-) Most of the packages' tables include user-defined fields, for custom data capture or status tracking. With point-and-click the 'programmer' places those fields on a screen, and can then build get focus/lose focus logic behind them, and/or database triggers when they get saved.
It's moot whether they "make money", because the profitability is wrapped up in the whole package purchase. The vendors regard them more as 'loss leaders', I guess, because all the competition offers something similar. And I don't see the vendors making huge efforts to enhance them: they're pretty low-level primitives, certainly without powerful abstractions or any ability to extend. You want custom formatting/validation for your Customer IDs? Then paste the code into every screen/widget, don't expect to call a routine, and fersure don't even dream of calling Java/C#/Python.
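To make that complaint concrete -- the shape below is entirely hypothetical, not any vendor's actual scripting model -- the difference is between one shared, testable routine and an isolated event body re-pasted behind every screen's widget:

```csharp
using System.Text.RegularExpressions;

static class CustomerIdChecks
{
    // What you'd like: one shared, reusable routine callable from every screen.
    public static bool IsValid(string id) => Regex.IsMatch(id, @"^C\d{7}$");

    // What the point-and-click tools tend to give you: an isolated event body,
    // re-pasted (regex and all) behind each screen's field. Hypothetical shape.
    public static string? CustomerId_LoseFocus(string value) =>
        Regex.IsMatch(value, @"^C\d{7}$") ? null : "Customer ID must be C followed by 7 digits";
}
```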
... If your tool doesn't allow coding, the hired consultant -- or a savvy user -- will re-write the application in Java or C# or Python or something else, they won't pick your tool for the next project, and you'll lose a customer.
There are armies of consultants-for-hire who code in this stuff. They're more BAs-who-code than programmers, because most of the effort goes into the business analysis. The decision-makers at the client (not IT) just accept those consultants' fees as the cost of doing business. We are of course more in a world where the client works for the ERP vendor, and pays for the privilege; changing your ERP on the grounds that its scripting language is poor (or any other particular part of it) would be the tail wagging the dog.
Quote from dandl on February 7, 2021, 6:55 am
But that was then and this is now, 30 years later. Those are tired old arguments that no longer apply. Those products always targeted programmers, or people who were prepared to learn how to write programs. There is a big market now for nocode products and they make money. If you don't want to engage with that, it's your choice. Stay with what you know. I see new horizons.
As another data point ... Most of the large-scale ERP packages (including the middle-market ones Microsoft has bought up) come with Low Code point-and-click languages. (I'm not up to speed with whether Microsoft are trying to convert theirs to VBA, but that would be a huge effort for little commercial benefit.)
Agreed, but I'm really talking about Airtable, Knack, Nintex, Quickbase and the like. Nobody keeps doing what they're doing if they're losing money.
But no, they do not in general have a GP language hiding in the wings.
... If your tool doesn't allow coding, the hired consultant -- or a savvy user -- will re-write the application in Java or C# or Python or something else, they won't pick your tool for the next project, and you'll lose a customer.
There are armies of consultants-for-hire who code in this stuff. They're more BAs-who-code than programmers, because most of the effort goes into the business analysis. The decision-makers at the client (not IT) just accept those consultants' fees as the cost of doing business. We are of course more in a world where the client works for the ERP vendor, and pays for the privilege; changing your ERP on the grounds that its scripting language is poor (or any other particular part of it) would be the tail wagging the dog.
It's not a bad process: build the prototype/proof of concept in a nocode tool and then code it when the investment seems worth it. The aim is to raise that 'must code' hurdle.
Quote from dandl on February 8, 2021, 5:07 am
Yes. This is the Access version of exactly the process I have in mind. So, at a more abstract level, the process is:
- Create a series of documents representing the database, relationships, grids, forms, reports and queries that constitute 80% of the application
- Write code snippets in some GP language for the other 20%.
- Integrate the two (Access uses COM but there are better choices).
I spent a little time with Access (it's been a long time) and yes indeed, it's the same model. Right across the top you can create tables, queries, forms (grid and item), reports and code. It's rather heavy on the UI visuals (as you would expect), but it certainly seems able to satisfy my Proposition 1.
Does it use COM?
I've never noticed. From an Access point of view, that's completely in the background and invisible to the user/developer.
Not really. The Component Object Model is totally based on objects and comes with a particular type system. There are things you can't do easily, and it doesn't sit well with other GP languages such as Java or even C#. The same language works right across Office precisely because they all expose a COM model. You have to buy into a particular narrow set of technologies to be comfortable with Access, I think.
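For what it's worth, driving Access over COM from C# is possible but late-bound and decidedly un-idiomatic, which is roughly the friction being described. A sketch only: Windows, Access installed, and the database path is a placeholder; it assumes the standard Access.Application automation object:

```csharp
using System;

class DriveAccessOverCom
{
    static void Main()
    {
        // Late-bound COM automation of Access from C#: workable, but typeless,
        // Windows-only, and dependent on Access being installed.
        Type? progId = Type.GetTypeFromProgID("Access.Application");
        if (progId is null) throw new InvalidOperationException("Access is not installed");

        dynamic app = Activator.CreateInstance(progId)!;
        app.Visible = false;
        app.OpenCurrentDatabase(@"C:\data\Example.accdb");      // placeholder path
        Console.WriteLine(app.CurrentDb().TableDefs.Count);     // table count via DAO
        app.Quit();
    }
}
```

Everything goes through IDispatch and the C# dynamic binder, so there is no compile-time checking at all; that is the "doesn't sit well" part in practice.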
Which raises the question: is it alone on the island? Are there any other products that can do this exact thing, without being tied to these specific technologies? There are lots of people out there who say they can, but really?
So, if you had a product of this kind, which roughly conformed to the Proposition 1 model and (unlike Access) was really friendly to all the tools you like, would you use it?