Organizations are dealing with growing pools of data, data-processing pipelines, and the associated risks of data breaches, unauthorized access, and poor data quality. The complex matrix of data protection and privacy regulations that businesses must comply with across geographies has driven the search for more streamlined approaches to data governance. These approaches have moved from merely checking the box on compliance audits to design-led methods, automated tests and tooling, and integrated principles and procedures.
The project aims to acquaint the audience with current approaches and practices for managing data and, through that, to reduce operational risk to the business. By bringing together experts across sectors and blending tools, frameworks, principles, and design, the project will enable a cross-section of teams within organizations to address blind spots and view data governance as a complete package rather than a siloed set of specific processes and policy requirements.
This presentation focuses on decentralised semantics and how the segregation of task-oriented objects within a standard layered architecture can provide a long-term solution for unifying a data language within (and between) distributed data ecosystems. From that lens, decentralised semantics is ontology-agnostic, offering a harmonisation solution between data models and data representation formats while providing a roadmap to enable privacy-compliant data sharing between servers and networks, and across sectoral or jurisdictional boundaries.
What is “Decentralised semantics”?
Data semantics is the study of the meaning and use of data in any digital environment; its focus is on how a data object represents a concept or object in the real world. That definition also underpins decentralised semantics, the only difference being that the decentralised version is specific to distributed data ecosystems.
Decentralised semantics describes a data modelling methodology of layering and cryptographically binding task-specific objects (overlays) to a standard capture base, which, when combined, defines a complex digital object. The segregation of task-specific overlays enables dynamic semantic interoperability in the construction process of any digital object without compromising the objectual integrity of the semantic structure, its modular components, or the relationship between those objects.
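The layering described above can be illustrated with a minimal sketch. This is a hypothetical, simplified model loosely inspired by architectures such as OCA, not the official specification: a SHA-256 digest of the capture base stands in for a proper content identifier, and the overlay names and fields are illustrative assumptions.

```python
# Illustrative sketch of decentralised-semantics layering: task-specific
# overlays bind to a stable capture base via its digest. All names and the
# digest scheme are assumptions for demonstration, not a standard format.
import hashlib
import json

def digest(obj: dict) -> str:
    """Deterministic SHA-256 digest standing in for a content identifier."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

# 1. A stable capture base defines the attribute names and types.
capture_base = {
    "type": "capture_base",
    "attributes": {"patient_id": "Text", "dob": "DateTime", "weight": "Numeric"},
}
base_digest = digest(capture_base)

# 2. Task-specific overlays reference the capture base by digest, so each
#    overlay (labels, formats, translations, ...) can evolve independently
#    without altering the base semantic structure.
label_overlay = {
    "type": "overlay/label",
    "capture_base": base_digest,
    "language": "en",
    "labels": {"patient_id": "Patient ID", "dob": "Date of birth",
               "weight": "Weight (kg)"},
}
format_overlay = {
    "type": "overlay/format",
    "capture_base": base_digest,
    "formats": {"dob": "YYYY-MM-DD"},
}

# 3. The combined bundle is the complex digital object; each layer's claim
#    to extend the capture base can be verified against the base digest.
bundle = [capture_base, label_overlay, format_overlay]
for obj in bundle:
    if obj["type"].startswith("overlay"):
        assert obj["capture_base"] == base_digest, "overlay not bound to base"
```

Because every overlay carries the digest of the capture base it extends, any party can verify the binding independently, which is what allows overlays to be added, swapped, or translated without compromising the integrity of the underlying structure.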
Decentralised semantics provides a powerful solution for semantic interoperability, data harmonisation, internationalisation, and dynamic presentation.
Short presenter bio
Introducing Paul Knowles, a true pioneer in Big Data initiatives
Paul Knowles is Head of the Advisory Council at The Human Colossus Foundation, a non-profit technological organisation based in Geneva, Switzerland, and Chief Data Officer at MeDDEa Solutions AG, a Dynamic Data Economy (DDE) consultancy and service provider based in Basel, Switzerland. He is the inventor of Overlays Capture Architecture (OCA), a next-generation core public utility technology to harmonise data and semantics across data models and data representation formats. His reputation as a Decentralised Semantics expert and innovator stems from a 25-year career in Healthcare Data Science, where he has worked with companies including Roche, Novartis, GlaxoSmithKline, Amgen, and Pfizer.