The Forum for Discussion about The Third Manifesto and Related Matters


The future of TTM


Is there any technical/engineering field that doesn't need a good "burn and start over"?

How many do it?

Every startup.

In electronics, the move from tubes/valves to discrete semiconductors, and from discrete semiconductors to integrated circuits, and from manual VLSIC definitions to using VHDL, and from point-to-point wiring to through-hole PC boards to SMDs, were arguably closer to "burn and start over" revolutions than evolutions, but in each case -- with some grumbling, admittedly -- the move was inevitable. The cost savings and performance gains for each step were remarkable.

Over here in IT, do we have an equivalent?

From my personal dev experience: punch cards, time share, mainframe multi-user, 16-bit minicomputer, 32-bit minicomputer, CP/M microcomputer, IBM PC 640K, DOS 286/386, Basic, Windows 16, Visual Basic, Win32, COM/DNA, Java, Internet 1.0, SQL, ASP, Internet 2.0, C#, JS, NPM and so on. Every single one of those provided all-new tools and techniques which appeared to have learned little or nothing from what went before. And because every single one was so different and so disruptive, most provided a way to keep doing it the same old way.

At least hardware breaks, wears out, goes in the bin. It seems old software never does.

But then, in electronics when you want a new whuzzit, you buy a new one and throw the old one away (or sell it on eBay and a vintage electronics collector like me buys it) and keep going without a hiccup.

Try that with your banking software, or air traffic control software, or hospital records system, or ERP system, or ...

And everyone understands unreliability and instability in physical devices. It's surprising (or perhaps not) that a level of flakiness which would have most folks chucking a gadget or tool in the bin is not only tolerated but weirdly embraced when it's software.

And the main problem is cost. It's massively expensive to rewrite software; it's risky (because the tech is so different); and the payback is way off in the future with no benefit to the current management.

I don't see this changing anytime soon.

Andl - A New Database Language - andl.org
Quote from Erwin on July 23, 2021, 8:38 pm
Quote from Dave Voorhis on July 23, 2021, 7:50 pm

Solutions need to save/earn money, save effort, and improve quality. The link between "improve quality" and "save/earn money" (and between "save effort" and "save/earn money") probably needs to be spelled out.

It helps too if it doesn't require burning the disk packs and starting over. Too many grand solutions require that, and there aren't many willing to pay for it let alone do it.

Even grand solutions that look like starting over ("we're going to scrap Java and switch to .NET Core!") aren't really.

The result of an assessment of "save/earn money" depends far too crucially on the time frame taken into account.  And today's managers operate on the premise that "short term" means the next fiscal quarter and "long term" means two years, because that's how long they intend to stay at the company where they hold that manager position.

I contend that "burning the disc packs and starting over" ***WILL*** prove to pay off, say over a period of 20 years, at least if you're clever enough to start the burning with only the least crucial disc packs.  20 years gives you the slack to do that.

My theme is all about a future of TTM with no line of business applications and no shared databases. In this context most of the above does not apply. These are typically small pieces of software created by individuals or small teams, or larger applications of which the database is a small part. A games company that happens to use SQLite for managing saves in one of its games will readily switch to something different for the next game if there are benefits.

Yes, data quality is a big issue, but those papers are largely irrelevant. The challenge is to take TTM ideas and implement them in-process, within the same general-purpose (GP) language used for the rest of the application. For the existing well-known compiled languages that means embedding TTM-like features into C++/Java/C# or perhaps Scala/Rust/whatever. The key features are:

  • Relations as a data type
  • Attributes can use host language type system
  • DQL based on RA, with extensions (at least extend, aggregate, tclose)
  • Select, extend, aggregation can use host language functions
  • Updates by relational MA
  • Database features including DDL, constraints, transactions.

The final result should be at least as capable as SQLite in every detail, but all written in the host language with no SQL. If you're competing with SQL Server/Oracle/MySQL you're on the wrong train!
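
To make that concrete, here is a minimal Java sketch of the first four features: relations as a host-language data type, with RA operators that take ordinary host-language functions. All the names here (Relation, where, extend, aggregate, Save) are invented for illustration; this is not an existing library, just one plausible shape for it.

```java
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.function.BiFunction;
import java.util.function.Function;
import java.util.function.Predicate;

// Hypothetical sketch only. A relation is an immutable set of tuples of some
// host-language record type T, so attributes use the host type system directly.
final class Relation<T> {
    private final Set<T> tuples;

    Relation(Set<T> tuples) {
        this.tuples = Set.copyOf(tuples); // set semantics: duplicate tuples collapse
    }

    // RESTRICT (WHERE): keep the tuples satisfying a host-language predicate.
    Relation<T> where(Predicate<T> p) {
        Set<T> out = new LinkedHashSet<>();
        for (T t : tuples) if (p.test(t)) out.add(t);
        return new Relation<>(out);
    }

    // EXTEND (and, degenerately, project): map each tuple to a new tuple type.
    <U> Relation<U> extend(Function<T, U> f) {
        Set<U> out = new LinkedHashSet<>();
        for (T t : tuples) out.add(f.apply(t));
        return new Relation<>(out);
    }

    // AGGREGATE: fold the whole relation down to a single host-language value.
    <A> A aggregate(A zero, BiFunction<A, T, A> step) {
        A acc = zero;
        for (T t : tuples) acc = step.apply(acc, t);
        return acc;
    }
}

class Demo {
    record Save(String player, int level, int score) {} // an ordinary record as the tuple type

    public static void main(String[] args) {
        Relation<Save> saves = new Relation<>(Set.of(
                new Save("ann", 3, 1200), new Save("bob", 1, 300)));
        int total = saves.where(s -> s.level() > 1)
                         .aggregate(0, (acc, s) -> acc + s.score());
        System.out.println(total); // prints 1200: no SQL, no strings, host types throughout
    }
}
```

Join, rename, constraints and update by assignment would need more machinery (heading metadata, keys, a transaction layer), but the flavour is the point: no SQL anywhere, and every attribute is typed by the host language.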

Andl - A New Database Language - andl.org
Quote from dandl on July 24, 2021, 12:51 am

Is there any technical/engineering field that doesn't need a good "burn and start over"?

How many do it?

Every startup.

I don't know any startups (and I know a lot of startups) that burn and start over. In the software space, they're invariably built on a stack of conventional, off-the-shelf (where the shelf is GitHub, usually) tools with a new development perched on top like a wee maraschino cherry on a multi-scoop sundae.  Approaches and tools tend to be almost dully repetitive in that space (and everywhere.)

There are a few that deviate in terms of tool/platform choice (e.g., a few that don't create a cloud-first, MongoDB-based Web site built with Python/TypeScript/Node/Vue/GraphQL/whatever) but the essential adherence to popular choices is pervasive.

In electronics, the move from tubes/valves to discrete semiconductors, and from discrete semiconductors to integrated circuits, and from manual VLSIC definitions to using VHDL, and from point-to-point wiring to through-hole PC boards to SMDs, were arguably closer to "burn and start over" revolutions than evolutions, but in each case -- with some grumbling, admittedly -- the move was inevitable. The cost savings and performance gains for each step were remarkable.

Over here in IT, do we have an equivalent?

From my personal dev experience: punch cards, time share, mainframe multi-user, 16-bit minicomputer, 32-bit minicomputer, CP/M microcomputer, IBM PC 640K, DOS 286/386, Basic, Windows 16, Visual Basic, Win32, COM/DNA, Java, Internet 1.0, SQL, ASP, Internet 2.0, C#, JS, NPM and so on. Every single one of those provided all-new tools and techniques which appeared to have learned little or nothing from what went before. And because every single one was so different and so disruptive, most provided a way to keep doing it the same old way.

I guess this must be a matter of perception or definition. I wouldn't describe a single one of those as disruptive compared to what came before. In every case, they were evolutionary, not revolutionary, at least compared to the dramatic revolutions in electronics. Even the most significant innovations -- like (say) punchcards compared to interactive terminals, or SQL compared to record-oriented databases -- are akin in the electronics world to (say) moving from single-sided PC boards to multi-layer. Significant, but not revolutionary.

Probably the closest to real revolution (equivalent to electronics revolutions) that I've ever seen in computing is experimental work on moving from computers that run applications or parts of applications, to distributed swarms of tiny, low-power (in every sense) embedded processors (with no central "server" or anything like it) where they collectively cooperate to run "applications".

In such approaches, almost nothing is the same as what went before. No migration path there, it's a genuine "burn the disks" start-over with all-new computational foundations, tools, platforms, languages, interfaces, interactions, everything. But that's all still in the research stage.

At least hardware breaks, wears out, goes in the bin. It seems old software never does.

But then, in electronics when you want a new whuzzit, you buy a new one and throw the old one away (or sell it on eBay and a vintage electronics collector like me buys it) and keep going without a hiccup.

Try that with your banking software, or air traffic control software, or hospital records system, or ERP system, or ...

And everyone understands unreliability and instability in physical devices. It's surprising (or perhaps not) that a level of flakiness which would have most folks chucking a gadget or tool in the bin is not only tolerated but weirdly embraced when it's software.

And the main problem is cost. It's massively expensive to rewrite software; it's risky (because the tech is so different); and the payback is way off in the future with no benefit to the current management.

I don't see this changing anytime soon.

Indeed, it won't.

It's not even that the tech is so different. In computer science terms, it isn't; in a theoretical sense most IT tech "change" is barely more than a different paint colour and the shape of the decorative trim. It's that any change is too expensive and risky -- except, it seems, toward career-growth/CV-enhancingly fashionable different-but-same mass trends (e.g., Vue vs React vs Angular vs whatever + AWS/Azure/GC) on new projects (or new parts of old projects.)

I'm the forum administrator and lead developer of Rel. Email me at dave@armchair.mb.ca with the Subject 'TTM Forum'. Download Rel from https://reldb.org
Quote from dandl on July 24, 2021, 4:05 am
Quote from Erwin on July 23, 2021, 8:38 pm
Quote from Dave Voorhis on July 23, 2021, 7:50 pm

Solutions need to save/earn money, save effort, and improve quality. The link between "improve quality" and "save/earn money" (and between "save effort" and "save/earn money") probably needs to be spelled out.

It helps too if it doesn't require burning the disk packs and starting over. Too many grand solutions require that, and there aren't many willing to pay for it let alone do it.

Even grand solutions that look like starting over ("we're going to scrap Java and switch to .NET Core!") aren't really.

The result of an assessment of "save/earn money" depends far too crucially on the time frame taken into account.  And today's managers operate on the premise that "short term" means the next fiscal quarter and "long term" means two years, because that's how long they intend to stay at the company where they hold that manager position.

I contend that "burning the disc packs and starting over" ***WILL*** prove to pay off, say over a period of 20 years, at least if you're clever enough to start the burning with only the least crucial disc packs.  20 years gives you the slack to do that.

My theme is all about a future of TTM with no line of business applications and no shared databases. In this context most of the above does not apply. These are typically small pieces of software created by individuals or small teams, or larger applications of which the database is a small part. A games company that happens to use SQLite for managing saves in one of its games will readily switch to something different for the next game if there are benefits.

Yes, data quality is a big issue, but those papers are largely irrelevant. The challenge is to take TTM ideas and implement them in-process, within the same general-purpose (GP) language used for the rest of the application. For the existing well-known compiled languages that means embedding TTM-like features into C++/Java/C# or perhaps Scala/Rust/whatever. The key features are:

  • Relations as a data type
  • Attributes can use host language type system
  • DQL based on RA, with extensions (at least extend, aggregate, tclose)
  • Select, extend, aggregation can use host language functions
  • Updates by relational MA
  • Database features including DDL, constraints, transactions.

The final result should be at least as capable as SQLite in every detail, but all written in the host language with no SQL. If you're competing with SQL Server/Oracle/MySQL you're on the wrong train!

A bit like Prevayler (https://prevayler.org/) but with relational operators, then?

Note that one of the big selling points around SQLite (other than using SQL, which developers generally treat as a necessary but uninteresting evil, similar to the way regular expressions are treated) is that it works cross-language (your Python code can write a SQLite db and your Java code can read it) and you can browse it with external tools like DB Browser (https://sqlitebrowser.org/).  Not sure how you'd handle that with native non-canonical types, but maybe the type-handling of cross-language object brokers like CORBA would provide some ideas, or maybe the focus would be on language integration and it simply wouldn't support cross-language portability or browsing.
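
If portability did matter, one possibility (purely a sketch, with invented names like CanonicalExport and toJsonLines) is to export relations to a canonical, language-neutral text form for browsing and cross-language exchange, reducing host-native attribute types to canonical scalars via registered converters, much as CORBA brokers mapped language types onto a common type system:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch only. Assumes an upstream step has already reduced each
// tuple to attribute-name -> canonical scalar (String, Long, Double, Boolean)
// via per-type converters; anything irreducible would need its own converter
// or stay opaque to external tools.
final class CanonicalExport {
    static String toJsonLines(List<Map<String, Object>> tuples) {
        StringBuilder sb = new StringBuilder();
        for (Map<String, Object> tuple : tuples) {
            sb.append('{');
            boolean first = true;
            for (Map.Entry<String, Object> attr : tuple.entrySet()) {
                if (!first) sb.append(',');
                first = false;
                sb.append('"').append(attr.getKey()).append("\":");
                Object v = attr.getValue();
                if (v instanceof String s) {
                    // simplified escaping, enough for the sketch
                    sb.append('"').append(s.replace("\"", "\\\"")).append('"');
                } else {
                    sb.append(v); // Long, Double, Boolean render as JSON literals
                }
            }
            sb.append("}\n"); // one JSON object per tuple: greppable, browsable
        }
        return sb.toString();
    }
}
```

Any language with a JSON parser could then read the data, at the cost of losing the native types at the boundary; the alternative, as noted, is to focus on language integration and give up cross-language portability and browsing entirely.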

I'm the forum administrator and lead developer of Rel. Email me at dave@armchair.mb.ca with the Subject 'TTM Forum'. Download Rel from https://reldb.org
Quote from Dave Voorhis on July 24, 2021, 8:03 am
Quote from dandl on July 24, 2021, 12:51 am

Is there any technical/engineering field that doesn't need a good "burn and start over"?

How many do it?

Every startup.

I don't know any startups (and I know a lot of startups) that burn and start over. In the software space, they're invariably built on a stack of conventional, off-the-shelf (where the shelf is GitHub, usually) tools with a new development perched on top like a wee maraschino cherry on a multi-scoop sundae.  Approaches and tools tend to be almost dully repetitive in that space (and everywhere.)

There are a few that deviate in terms of tool/platform choice (e.g., a few that don't create a cloud-first, MongoDB-based Web site built with Python/TypeScript/Node/Vue/GraphQL/whatever) but the essential adherence to popular choices is pervasive.

That's silly. We were discussing line-of-business corporate application development: code bases going back decades, owned by companies unwilling to burn their old code and start over with the latest framework. At best, they do a pilot or two. A startup is a burn-and-start-over. Their entire competitive advantage may be just that. Like Facebook: move fast and break things.

In electronics, the move from tubes/valves to discrete semiconductors, and from discrete semiconductors to integrated circuits, and from manual VLSIC definitions to using VHDL, and from point-to-point wiring to through-hole PC boards to SMDs, were arguably closer to "burn and start over" revolutions than evolutions, but in each case -- with some grumbling, admittedly -- the move was inevitable. The cost savings and performance gains for each step were remarkable.

Over here in IT, do we have an equivalent?

From my personal dev experience: punch cards, time share, mainframe multi-user, 16-bit minicomputer, 32-bit minicomputer, CP/M microcomputer, IBM PC 640K, DOS 286/386, Basic, Windows 16, Visual Basic, Win32, COM/DNA, Java, Internet 1.0, SQL, ASP, Internet 2.0, C#, JS, NPM and so on. Every single one of those provided all-new tools and techniques which appeared to have learned little or nothing from what went before. And because every single one was so different and so disruptive, most provided a way to keep doing it the same old way.

I guess this must be a matter of perception or definition. I wouldn't describe a single one of those as disruptive compared to what came before. In every case, they were evolutionary, not revolutionary, at least compared to the dramatic revolutions in electronics. Even the most significant innovations -- like (say) punchcards compared to interactive terminals, or SQL compared to record-oriented databases -- are akin in the electronics world to (say) moving from single-sided PC boards to multi-layer. Significant, but not revolutionary.

Beyond basic Computer Science, there was not a single thing about programming a mainframe in the 1970s that carried forward into minicomputers of the 1980s: not a line of code, not a data format, not a basic software architecture. The big shifts were to the PC, to the web, to mobile and to embedded, but even the smaller shifts came with a totally new toolset: think of moving from green-screen PC to Windows. Yes, the data formats and protocols and some languages now tend to stick around, but everything else changes. LAMP or Java for the web is nothing like iPhone or Android. Game development is nothing like corporate database development.

Probably the closest to real revolution (equivalent to electronics revolutions) that I've ever seen in computing is experimental work on moving from computers that run applications or parts of applications, to distributed swarms of tiny, low-power (in every sense) embedded processors (with no central "server" or anything like it) where they collectively cooperate to run "applications".

In such approaches, almost nothing is the same as what went before. No migration path there, it's a genuine "burn the disks" start-over with all-new computational foundations, tools, platforms, languages, interfaces, interactions, everything. But that's all still in the research stage.

It's not even that the tech is so different. In computer science terms, it isn't; in a theoretical sense most IT tech "change" is barely more than a different paint colour and the shape of the decorative trim. It's that any change is too expensive and risky -- except, it seems, toward career-growth/CV-enhancingly fashionable different-but-same mass trends (e.g., Vue vs React vs Angular vs whatever + AWS/Azure/GC) on new projects (or new parts of old projects.)

You picked 'same old', but compare that to writing code for the instrument panel in your car and there are virtually no points of contact.

Andl - A New Database Language - andl.org

In general, the real cost of system change is the cost of implementing original requirements that weren't recognized by the original implementation. A small example is the fairly recent recognition by email providers that there is a requirement to do something when an account holder dies. Of course, they can only approximately infer death by indirect techniques, so only the approximate cost is ever known, not the real cost.

Quote from dandl on July 24, 2021, 11:00 am
Quote from Dave Voorhis on July 24, 2021, 8:03 am
Quote from dandl on July 24, 2021, 12:51 am

Is there any technical/engineering field that doesn't need a good "burn and start over"?

How many do it?

Every startup.

I don't know any startups (and I know a lot of startups) that burn and start over. In the software space, they're invariably built on a stack of conventional, off-the-shelf (where the shelf is GitHub, usually) tools with a new development perched on top like a wee maraschino cherry on a multi-scoop sundae.  Approaches and tools tend to be almost dully repetitive in that space (and everywhere.)

There are a few that deviate in terms of tool/platform choice (e.g., a few that don't create a cloud-first, MongoDB-based Web site built with Python/TypeScript/Node/Vue/GraphQL/whatever) but the essential adherence to popular choices is pervasive.

That's silly. We were discussing line-of-business corporate application development: code bases going back decades, owned by companies unwilling to burn their old code and start over with the latest framework. At best, they do a pilot or two. A startup is a burn-and-start-over. Their entire competitive advantage may be just that. Like Facebook: move fast and break things.

In electronics, the move from tubes/valves to discrete semiconductors, and from discrete semiconductors to integrated circuits, and from manual VLSIC definitions to using VHDL, and from point-to-point wiring to through-hole PC boards to SMDs, were arguably closer to "burn and start over" revolutions than evolutions, but in each case -- with some grumbling, admittedly -- the move was inevitable. The cost savings and performance gains for each step were remarkable.

Over here in IT, do we have an equivalent?

From my personal dev experience: punch cards, time share, mainframe multi-user, 16-bit minicomputer, 32-bit minicomputer, CP/M microcomputer, IBM PC 640K, DOS 286/386, Basic, Windows 16, Visual Basic, Win32, COM/DNA, Java, Internet 1.0, SQL, ASP, Internet 2.0, C#, JS, NPM and so on. Every single one of those provided all-new tools and techniques which appeared to have learned little or nothing from what went before. And because every single one was so different and so disruptive, most provided a way to keep doing it the same old way.

I guess this must be a matter of perception or definition. I wouldn't describe a single one of those as disruptive compared to what came before. In every case, they were evolutionary, not revolutionary, at least compared to the dramatic revolutions in electronics. Even the most significant innovations -- like (say) punchcards compared to interactive terminals, or SQL compared to record-oriented databases -- are akin in the electronics world to (say) moving from single-sided PC boards to multi-layer. Significant, but not revolutionary.

Beyond basic Computer Science, there was not a single thing about programming a mainframe in the 1970s that carried forward into minicomputers of the 1980s: not a line of code, not a data format, not a basic software architecture. The big shifts were to the PC, to the web, to mobile and to embedded, but even the smaller shifts came with a totally new toolset: think of moving from green-screen PC to Windows. Yes, the data formats and protocols and some languages now tend to stick around, but everything else changes. LAMP or Java for the web is nothing like iPhone or Android. Game development is nothing like corporate database development.

Probably the closest to real revolution (equivalent to electronics revolutions) that I've ever seen in computing is experimental work on moving from computers that run applications or parts of applications, to distributed swarms of tiny, low-power (in every sense) embedded processors (with no central "server" or anything like it) where they collectively cooperate to run "applications".

In such approaches, almost nothing is the same as what went before. No migration path there, it's a genuine "burn the disks" start-over with all-new computational foundations, tools, platforms, languages, interfaces, interactions, everything. But that's all still in the research stage.

It's not even that the tech is so different. In computer science terms, it isn't; in a theoretical sense most IT tech "change" is barely more than a different paint colour and the shape of the decorative trim. It's that any change is too expensive and risky -- except, it seems, toward career-growth/CV-enhancingly fashionable different-but-same mass trends (e.g., Vue vs React vs Angular vs whatever + AWS/Azure/GC) on new projects (or new parts of old projects.)

You picked 'same old', but compare that to writing code for the instrument panel in your car and there are virtually no points of contact.

This has clearly turned into one of those debates where two people look at the same piece of string and one declares it's a short long-string and the other insists that no, it's a long short-string.

Not sure this is going anywhere in either case.

I'm the forum administrator and lead developer of Rel. Email me at dave@armchair.mb.ca with the Subject 'TTM Forum'. Download Rel from https://reldb.org
Quote from Dave Voorhis on July 24, 2021, 4:54 pm

Not sure this is going anywhere in either case.

Rather surprised to find out you still haven't learned that these discussions always do go somewhere: in circles.  Every participant always ends up reverting to their own axioms/beliefs/... .  And sometimes someone throws in a perhaps relatively tangential remark that makes others realize "oh, so he HAS been there and done that and, despite everything, has a certain sense of how it all just sucks".

Quote from Erwin on July 24, 2021, 7:54 pm
Quote from Dave Voorhis on July 24, 2021, 4:54 pm

Not sure this is going anywhere in either case.

Rather surprised to find out you still haven't learned that these discussions always do go somewhere: in circles.

See https://xkcd.com/386/

Every participant always ends up reverting to their own axioms/beliefs/... .  And sometimes someone throws in a perhaps relatively tangential remark that makes others realize "oh, so he HAS been there and done that and, despite everything, has a certain sense of how it all just sucks".

I presume you mean, "... because of everything ...", not "... despite everything ..."?

I'm the forum administrator and lead developer of Rel. Email me at dave@armchair.mb.ca with the Subject 'TTM Forum'. Download Rel from https://reldb.org

Not sure this is going anywhere in either case.

I made no claim. I reject the claim that hardware is categorically different from software in the range, degree and/or kind of change over the past 70-odd years (since 'before the transistor') as not adequately made out or defended. I'm happy to leave it there.

Andl - A New Database Language - andl.org