Just to say that I really like the formulation below.
Indeed: how to combine the elegant, simple tree worlds (JSON/CSV-like) with
the more powerful, but sometimes too powerful/complex, graph worlds (SHACL,
JSON-LD, LPGs).

After 40 years in data modelling promoting a lot of linked data (especially
via standardization like CEN TC442 EN17632) and also buildingSMART
(using good old ISO STEP tech like EXPRESS/SPFF, but also XML tech), I
sometimes feel that we lack the critical mass (in use and software support).

I simply find it hard to explain to our clients that they need open GIS (GML,
CityGML, GeoSPARQL), open BIM (IFC, bSDD, IDS, BCF), open Semantics
(Turtle, JSON-LD, SKOS, RDF(S), OWL, SHACL(-AF)), open measurement data
(CSV, JSON, JSON for the web, ....) — involving ~100 incompatible
technologies/standards. Did I already mention the CEN data templates
standard, also promoted for digital product passports?

So ..... any integration of tech, especially towards the simple side (a kind
of JSON Schema ++), would be welcome. The ++ would be
shape-like constraints beyond mere required/dependentRequired/pattern etc.
Replace "JSON Schema ++" with "Abstract SHACL" from below and you get a
similar idea....
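To make the "++" concrete, here is a minimal sketch (the vocabulary and IRIs are hypothetical) of the kind of cross-property constraint that SHACL shapes can express but JSON Schema's core keywords cannot:

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix ex:  <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Hypothetical shape: every ex:Sensor must have exactly one
# ex:rangeMin and one ex:rangeMax, and rangeMin must be less
# than rangeMax -- a property-pair constraint beyond
# required/dependentRequired/pattern.
ex:SensorShape
    a sh:NodeShape ;
    sh:targetClass ex:Sensor ;
    sh:property [
        sh:path ex:rangeMin ;
        sh:datatype xsd:decimal ;
        sh:minCount 1 ; sh:maxCount 1 ;
        sh:lessThan ex:rangeMax ;   # the "shape-like" extra
    ] ;
    sh:property [
        sh:path ex:rangeMax ;
        sh:datatype xsd:decimal ;
        sh:minCount 1 ; sh:maxCount 1 ;
    ] .
```

sh:lessThan is plain SHACL Core, yet already past what a tree-schema language offers without custom extensions.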

Maybe "RDF 1.2 / SHACL 1.2, all JSON-LD-serialized" is the thing I am after,
but I also hope for more simplicity.

One part of me is drawn to simpler JSON/JSON Schema, GraphQL, JSONPath,
etc.; the other part knows I need more complexity w.r.t.
constraints/restrictions/shapes to provide truly semantic ontologies, where
concepts/classes are defined as necessary and sufficient in terms of
their defining properties......
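For what I mean by necessary-and-sufficient definitions, a minimal OWL sketch (hypothetical vocabulary):

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix ex:  <http://example.org/> .

# Hypothetical: a LoadBearingWall is *defined* (necessary and
# sufficient) as a Wall whose ex:isLoadBearing value is true.
# owl:equivalentClass (not rdfs:subClassOf) makes the condition
# sufficient as well, so a reasoner can classify instances, not
# just check them.
ex:LoadBearingWall
    owl:equivalentClass [
        a owl:Class ;
        owl:intersectionOf (
            ex:Wall
            [ a owl:Restriction ;
              owl:onProperty ex:isLoadBearing ;
              owl:hasValue true ]
        )
    ] .
```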

So far my thoughts on SHACL — well, in a somewhat broader context, ....

Michel


On Thu, 19 Dec 2024 at 11:36, Hans-Juergen Rennau
<[email protected]> wrote:

> As you asked for "any other SHACL thoughts", here are mine.
>
> To me, the importance of SHACL is not restricted to RDF; it applies to the
> whole realm of data validation. The basic abstractions (shapes,
> constraints, expressions) are not tied to RDF, not even to data graphs. In
> general, I am interested in any attempt to bridge the gap between graph
> thinking and tree thinking (XML, HTML, JSON, CSV, ...), and SHACL, when
> viewed in an uncommonly abstract way, offers a promising perspective.
>
> Hans-Jürgen Rennau
> PS: Some work - theoretical and practical - in this direction is described
> here:
> https://www.parsqube.de/publikationen/combining-graph-and-tree-writing-shax-obtaining-shacl-xsd-and-more
> On Thursday, 19 December 2024 at 10:18:46 CET, Andy Seaborne
> <[email protected]> wrote:
>
>
>
> On 18/12/2024 01:53, Nicholas Car wrote:
> > The W3C has formed a second Data Shapes WG to extend and maintain
> SHACL-based Recommendations (standards).
>
> > Please see the proposed Deliverables:
> >
> > - SHACL 1.2 Core
> >
> > - SHACL 1.2 SPARQL Extensions
> > - SHACL 1.2 User Interfaces
> > - SHACL 1.2 Inferencing Rules
> > - SHACL 1.2 Profiling
> >
> > - SHACL 1.2 Compact Syntax
>
> What do you want to see in SHACL 1.2?
>
> * Are there missing SHACL-Core constraints that you would find useful?
>
> * Do you make use of SHACL-SPARQL?
> * Do you find you have to use SHACL-SPARQL for simple tasks?
>
> * What do you think about SHACL compact syntax?
>
> * Would you use SHACL rules (SHACL inference)?
>
> * any other SHACL thoughts?
>
> I have joined the working group (as an ASF representative) and I'd like
> to hear what matters to you.
>
>     Andy
>
> N.B.
>
> The working group github repo is:
>
>     https://github.com/w3c/data-shapes
>
> The WG is just starting up; the repo has the original SHACL 1.0 documents.
>
> https://github.com/w3c/shacl is the community group github area and has
> the draft input documents and many community-raised issues.
>
>
