ANSWERED: Aesthetics of collection delimiters - braces vs brackets
Quote from AntC on April 10, 2021, 9:34 am
Quote from Darren Duncan on April 10, 2021, 2:29 am
Quote from dandl on April 10, 2021, 12:27 am
JSON is absolutely standardised. It has strings, numbers, true/false, null, objects, arrays. Some libraries impose limits (such as floating point numbers instead of arbitrary precision) and some have extensions (comments and dates). If both ends are using the same schema, and assuming a convention for dates, then AFAIK all kinds of business data can be transferred. There are probably some corner cases, but I can't think of any right now. I'm using it right now for nested relational data and it works just fine.
Please have a look at this and share your thoughts:
http://seriot.ch/parsing_json.php - "Parsing JSON is a Minefield"
That is one of my sources explaining the non-standardization and issues with JSON.
Also https://github.com/nst/JSONTestSuite can be informative.
Here's one about YAML problems:
https://hitchdev.com/strictyaml/why/implicit-typing-removed/
Also, YAML, being a superset of JSON, inherits anything said about JSON.
If only the difficulties with SQL were as trivial and nit-picky as that lot. "Minefield"? That guy needs to get a life. Most of the parsing trouble can be characterised as: the file content doesn't comply with the standard, but the vendor/parser is trying to be friendly and make guesses. Sometimes it guesses wrong/inconsistently with other guesses. The 'fix' is: reject the file. Anyhoo it's not the standard that's at fault, but rather the parser/implementation.
In a few cases it's that there are implementation-defined limits on depth of nesting structures, on digits in a number/chars in a string/etc., or on how to represent/what counts as whitespace. Not sufficiently broken for me to be looking for an alternative, thank you.
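To make the 'reject the file' point concrete, here's a minimal sketch in Python (my choice of language, not anything from the thread): the stdlib parser is one of those 'friendly' ones and accepts the non-standard NaN literal, but a strict parse that refuses to guess rejects the file at the boundary.

import json

def reject_nonstandard(const):
    # RFC 8259 has no NaN/Infinity literals; refuse to guess, reject the file.
    raise ValueError("non-standard JSON constant: " + const)

lenient = json.loads('{"x": NaN}')  # "friendly" default: returns {'x': nan}

try:
    json.loads('{"x": NaN}', parse_constant=reject_nonstandard)
except ValueError as e:
    print("rejected:", e)  # strict behaviour: the file is rejected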
@dandl There are fewer than 10 predefined primitive scalar types; everything else is composed from them. Why is this a problem?
Now that we learn Darren sees this as both an interchange format, and a programming language in its own right ...
I'd expect a programming language to be able to declare its own types, including enumerations, not be limited to structures composed from predefined types. Then values/objects of those types, including literals, to get embedded in Sets/Relations/ranges/etc.; also to be referenced in (overloadable/generic) functions over those types. Darren wants the whole shebang to be interchangeable.
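For illustration only (Python here is my own choice; none of it comes from the thread): a user-declared enumeration whose values sit inside sets and get consumed by a generic function. Interchanging data like this means the declaration of the type has to travel along with the values, which is exactly the extra burden under discussion.

from enum import Enum
from typing import FrozenSet, TypeVar

# A user-declared type: not one of JSON's predefined primitives.
class Status(Enum):
    OPEN = "open"
    CLOSED = "closed"

T = TypeVar("T")

def is_allowed(allowed: FrozenSet[T], candidate: T) -> bool:
    # A generic function over whatever element type the set carries.
    return candidate in allowed

active = frozenset({Status.OPEN})         # enum values embedded in a set
print(is_allowed(active, Status.CLOSED))  # False

# To interchange `active`, the receiver also needs the declaration of Status,
# not just the serialised strings "open"/"closed".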
@dandl It's obvious the schema can be serialised and transferred as JSON, but now there has to be agreement of meaning. You can serialise the DDL from SQL Server and send it over to Oracle, but it will be treated as just a chunk of data, not something special like a schema.
Now we're talking about not only the schema but also type decls, function definitions, etc. JSON merely conveys (structured) data; it's not JSON's role to agree on meaning. That you can't interchange a SQL-Server schema with an Oracle schema is no surprise (nor is it Oracle's or JSON's role to implement the 'meaning'). So there needs to be an interpreter that recognises the DDL and applies the declarations into Oracle. JSON is still a fine mechanism for conveying the DDL. This is not a reason to throw out JSON.
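As a sketch of that division of labour (the envelope fields below are hypothetical, not an existing standard): JSON just carries the DDL as text plus enough metadata for a receiving interpreter to decide what to do with it; the 'meaning' lives entirely in that interpreter.

import json

# Hypothetical interchange envelope: JSON conveys the DDL, nothing more.
message = json.dumps({
    "kind": "ddl",
    "dialect": "sqlserver",
    "statements": ["CREATE TABLE S (SNO CHAR(5) PRIMARY KEY, CITY VARCHAR(20))"],
})

def apply_ddl(payload):
    doc = json.loads(payload)
    if doc["kind"] != "ddl":
        raise ValueError("not a schema message")
    for stmt in doc["statements"]:
        # The receiving side supplies the meaning: translate the dialect,
        # then execute against the target DBMS (stubbed out here).
        print("would translate", doc["dialect"], "DDL and apply:", stmt)

apply_ddl(message)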
The hard part here is defining structures to represent program objects. Whether they use { } or ( ) or [ ] to denote varieties of repeated/nested bits is neither here nor there. Just follow JSON. Please.
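And for the nested relational data dandl mentions, 'just follow JSON' looks like this (a sketch with made-up attribute names, using the usual ISO-8601 convention for dates): tuples map to { } objects, collections of tuples map to [ ] arrays, and there is nothing left to decide about delimiters.

import json
from datetime import date

# A relation with a relation-valued attribute, as plain Python structures.
suppliers = [
    {"sno": "S1", "since": date(2021, 4, 10).isoformat(),   # date convention: ISO 8601 string
     "shipments": [{"pno": "P1", "qty": 300}, {"pno": "P2", "qty": 200}]},
    {"sno": "S2", "since": date(2020, 1, 1).isoformat(),
     "shipments": []},
]

text = json.dumps(suppliers, indent=2)   # relation -> [ ] array, tuple -> { } object
assert json.loads(text) == suppliers     # round-trips exactly
print(text)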
Quote from dandl on April 11, 2021, 12:37 am
Quote from Darren Duncan on April 10, 2021, 2:29 am
Quote from dandl on April 10, 2021, 12:27 am
JSON is absolutely standardised. It has strings, numbers, true/false, null, objects, arrays. Some libraries impose limits (such as floating point numbers instead of arbitrary precision) and some have extensions (comments and dates). If both ends are using the same schema, and assuming a convention for dates, then AFAIK all kinds of business data can be transferred. There are probably some corner cases, but I can't think of any right now. I'm using it right now for nested relational data and it works just fine.
Please have a look at this and share your thoughts:
http://seriot.ch/parsing_json.php - "Parsing JSON is a Minefield"
That is one of my sources explaining the non-standardization and issues with JSON.
Also https://github.com/nst/JSONTestSuite can be informative.
Here's one about YAML problems:
https://hitchdev.com/strictyaml/why/implicit-typing-removed/
Also, YAML, being a superset of JSON, inherits anything said about JSON.
The JSON problems are all fixable, at the risk of breaking changes. YAML is beyond hope.
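For the record, the implicit-typing behaviour the strictyaml link above complains about is easy to reproduce with PyYAML (a third-party library implementing YAML 1.1); unquoted scalars silently change type:

import yaml  # PyYAML, YAML 1.1 implicit typing

doc = yaml.safe_load("country: no\nversion: 3.10\n")
print(doc)  # {'country': False, 'version': 3.1} -- 'no' became a bool, '3.10' a float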