Metadata Publishing
Metadata publishing is the process of making metadata elements available to external users, both people and machines, through a formal review process and a commitment to change control. Metadata publishing is the foundation upon which advanced distributed computing functions are built. But, like building foundations, care must be taken in metadata publishing systems to ensure the structural integrity of the systems built on top of them.
Definition of metadata publishing
Published metadata has the following characteristics:
# Metadata structures are available to the general public on a public web site or as a download
# There is a documented review and approval process for adding data elements to the system or updating them
# New releases are made available without disturbing prior versions
# The publishing organization makes a commitment to a change control process
Benefits of metadata publishing
When classifying the benefits of metadata publishing, two groups are usually cons ...
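As a concrete illustration of the characteristics listed above, the following sketch shows one way a publisher might expose a versioned metadata element for download. The element name, fields, and version scheme are invented for illustration and are not drawn from any particular standard.

import json

# A hypothetical published metadata element. New releases append a version
# entry; prior versions are left untouched (characteristic 3 above).
published_element = {
    "name": "PersonBirthDate",
    "definition": "The date on which a person was born.",
    "datatype": "date",
    "status": "published",  # set only after the documented review and approval
    "versions": [
        {"version": "1.0", "released": "2022-01-15"},
        {"version": "1.1", "released": "2023-06-02"},  # prior version still listed
    ],
}

# Making the element public could be as simple as serving this JSON for download.
print(json.dumps(published_element, indent=2))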



Metadata
Metadata is "data that provides information about other data", but not the content of the data, such as the text of a message or the image itself. There are many distinct types of metadata, including: * Descriptive metadata – the descriptive information about a resource. It is used for discovery and identification. It includes elements such as title, abstract, author, and keywords. * Structural metadata – metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters. It describes the types, versions, relationships, and other characteristics of digital materials. * Administrative metadata – the information to help manage a resource, like resource type, permissions, and when and how it was created. * Reference metadata – the information about the contents and quality of statistical data. * Statistical metadata – also called process data, may describe processes that collect, process, or produce st ...


Topic Maps
A topic map is a standard for the representation and interchange of knowledge, with an emphasis on the findability of information. Topic maps were originally developed in the late 1990s as a way to represent back-of-the-book index structures so that multiple indexes from different sources could be merged. However, the developers quickly realized that with a little additional generalization, they could create a meta-model with potentially far wider application. The ISO/IEC standard is formally known as ISO/IEC 13250:2003.
A topic map represents information using:
* ''topics'', representing any concept, from people, countries, and organizations to software modules, individual files, and events,
* ''associations'', representing hypergraph relationships between ''topics'', and
* ''occurrences'', representing information resources relevant to a particular ''topic''.
Topic maps are similar to concept maps and mind maps in many respects, though only topic maps are ISO standards. Topi ...
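A minimal sketch of the three constructs follows, using plain Python data structures rather than the standard's interchange syntax; the topic names and relationship types are invented for illustration.

# Topics: concepts identified here simply by name.
topics = {"puccini", "italy", "tosca"}

# Associations: typed relationships among topics, each role played by a topic.
associations = [
    {"type": "born-in", "roles": {"person": "puccini", "place": "italy"}},
    {"type": "composed-by", "roles": {"work": "tosca", "composer": "puccini"}},
]

# Occurrences: information resources relevant to a particular topic.
occurrences = {
    "puccini": ["https://example.org/puccini-biography.html"],
    "tosca": ["https://example.org/tosca-libretto.pdf"],
}

# Findability: gather everything directly connected to a topic of interest.
def related(topic):
    links = [a for a in associations if topic in a["roles"].values()]
    return {"associations": links, "occurrences": occurrences.get(topic, [])}

print(related("puccini"))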




ISO/IEC 11179
The ISO/IEC 11179 Metadata Registry (MDR) standard is an international ISO/IEC standard for representing an organization's metadata in a metadata registry. It documents the standardization and registration of metadata to make data understandable and shareable.
Intended purpose
Organizations exchange data between computer systems precisely, using enterprise application integration technologies. Completed transactions are often transferred to separate data warehouse and business rules systems with structures designed to support data for analysis. A de facto standard model for data integration platforms is the Common Warehouse Metamodel (CWM). Data integration is often also solved as a problem of data, rather than metadata, through the use of so-called master data. ISO/IEC 11179 claims to be a standard for metadata-driven exchange of data in a heterogeneous environment, based on exact definitions of data.
Structure of an ISO/IEC 11179 metadata registry
The ISO/IEC 11179 mod ...
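As a rough illustration of what "exact definitions of data" can look like, the sketch below assembles a data element from an object class, a property, and a representation. This is a simplified reading of the 11179 data element model, with invented field names and example values.

from dataclasses import dataclass

@dataclass
class DataElement:
    """A simplified 11179-style data element: what is described (object class),
    which characteristic (property), and how values are represented."""
    object_class: str      # the thing being described, e.g. "Person"
    data_property: str     # the characteristic of interest, e.g. "Birth Date"
    representation: str    # the value domain or format, e.g. "ISO 8601 date"
    definition: str        # the exact, registered definition

    def name(self):
        # A conventional-looking name assembled from the three parts.
        return f"{self.object_class} {self.data_property} ({self.representation})"

element = DataElement(
    object_class="Person",
    data_property="Birth Date",
    representation="ISO 8601 date",
    definition="The date on which a person was born.",
)
print(element.name())  # -> Person Birth Date (ISO 8601 date)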


Metadata Registry
A metadata registry is a central location in an organization where metadata definitions are stored and maintained in a controlled manner. A metadata repository is the database where the metadata is stored. The registry also records relationships between related metadata types. A metadata engine collects, stores, and analyzes information about data and metadata (data about data) in use within a domain.
Use of metadata registries
Metadata registries are used whenever data must be used consistently within an organization or group of organizations. Examples of these situations include:
* Organizations that transmit data using structures such as XML, Web Services, or EDI
* Organizations that need consistent definitions of data across time, between databases, between organizations, or between processes, for example when an organization builds a data warehouse
* Organizations that are attempting to break down "silos" of information captured within applications or proprietary file formats
Central ...
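The sketch below shows, in miniature, what "maintained in a controlled manner" can mean in practice: definitions are registered in one place, looked up by name, and changes add a new version rather than silently overwriting prior ones. The class and method names are invented for illustration.

class SimpleMetadataRegistry:
    """A toy in-memory registry: one controlled place for definitions."""

    def __init__(self):
        # name -> list of (version, definition), oldest first
        self._definitions = {}

    def register(self, name, definition):
        """Add a new version of a definition; prior versions are kept."""
        versions = self._definitions.setdefault(name, [])
        versions.append((len(versions) + 1, definition))

    def lookup(self, name, version=None):
        """Return the latest definition by default, or a specific version."""
        versions = self._definitions[name]
        if version is None:
            return versions[-1][1]
        return dict(versions)[version]

registry = SimpleMetadataRegistry()
registry.register("CustomerID", "A unique identifier assigned to a customer.")
registry.register("CustomerID", "A unique, immutable identifier assigned to a customer at account creation.")
print(registry.lookup("CustomerID"))     # latest definition
print(registry.lookup("CustomerID", 1))  # original definition, still available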



Semantic Technology
The ultimate goal of semantic technology is to help machines understand data. Well-known technologies for encoding semantics with data are RDF (Resource Description Framework) and OWL (Web Ontology Language). These technologies formally represent the meaning involved in information. For example, an ontology can describe concepts, relationships between things, and categories of things. Embedding these semantics with the data offers significant advantages such as reasoning over data and dealing with heterogeneous data sources.
Overview
In software, semantic technology encodes meanings separately from data and content files, and separately from application code. This enables machines as well as people to understand, share, and reason with them at execution time. With semantic technologies, adding, changing, and implementing new relationships or interconnecting programs in a different way can be just as simple as changing the external model that these programs share. With t ...
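The sketch below encodes a few statements as RDF triples and serializes them as Turtle. It assumes the third-party rdflib package is installed; the example namespace and the class and property names are invented.

from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

# Schema-level statements: a category of things and a relationship type.
g.add((EX.Composer, RDF.type, RDFS.Class))
g.add((EX.composed, RDF.type, RDF.Property))

# Instance-level statements: facts expressed against that shared model.
g.add((EX.Puccini, RDF.type, EX.Composer))
g.add((EX.Puccini, EX.composed, EX.Tosca))
g.add((EX.Puccini, RDFS.label, Literal("Giacomo Puccini")))

# The meaning travels with the data and can be exchanged as Turtle.
print(g.serialize(format="turtle"))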


Data Governance
Data governance is a term used on both a macro and a micro level. The former is a political concept and forms part of international relations and Internet governance; the latter is a data management concept and forms part of corporate data governance.
Macro level
On the macro level, data governance refers to the governing of cross-border data flows by countries, and hence is more precisely called ''international data governance''. This new field consists of "norms, principles and rules governing various types of data."
Micro level
On the micro level, the focus is on an individual company. Here, data governance is a data management concept concerning the capability that enables an organization to ensure that high data quality exists throughout the complete lifecycle of the data, and that data controls are implemented in support of business objectives. The key focus areas of data governance include availability, usability, consistency, data integrity and data security, standards compliance, and incl ...




Bibliographic Database
A bibliographic database is a database of bibliographic records, an organized digital collection of references to published literature, including journal and newspaper articles, conference proceedings, reports, government and legal publications, patents, books, etc. In contrast to library catalogue entries, a large proportion of the bibliographic records in bibliographic databases describe articles, conference papers, etc., rather than complete monographs, and they generally contain very rich subject descriptions in the form of keywords, subject classification terms, or abstracts. A bibliographic database may be general in scope or cover a specific academic discipline like computer science. A significant number of bibliographic databases are proprietary, available by licensing agreement from vendors, or directly from the indexing and abstracting services that create them. Many bibliographic databases have evolved into digital libraries, providing the full text of the indexed c ...
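As an illustration of the rich subject description mentioned above, the sketch below models a single bibliographic record for a journal article and a minimal keyword search over such records. The field names and values are invented for illustration rather than taken from a particular database format.

# A single bibliographic record: it describes an article, not a whole monograph,
# and carries subject keywords and an abstract to support searching.
record = {
    "type": "journal-article",
    "title": "Metadata Registries in Practice",
    "authors": ["A. Example", "B. Example"],
    "journal": "Journal of Illustrative Examples",
    "year": 2021,
    "volume": 12,
    "pages": "45-60",
    "keywords": ["metadata", "registries", "data management"],
    "abstract": "A short summary that, unlike a library catalogue entry, "
                "describes the article's content in detail.",
}

# A minimal keyword search over a collection of such records.
def search(records, term):
    term = term.lower()
    return [r for r in records if term in map(str.lower, r["keywords"])]

print([r["title"] for r in search([record], "metadata")])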


Domain Specific Language
A domain-specific language (DSL) is a computer language specialized to a particular application domain. This is in contrast to a general-purpose language (GPL), which is broadly applicable across domains. There are a wide variety of DSLs, ranging from widely used languages for common domains, such as HTML for web pages, down to languages used by only one or a few pieces of software, such as MUSH soft code. DSLs can be further subdivided by the kind of language, and include domain-specific ''markup'' languages, domain-specific ''modeling'' languages (more generally, specification languages), and domain-specific ''programming'' languages. Special-purpose computer languages have always existed in the computer age, but the term "domain-specific language" has become more popular due to the rise of domain-specific modeling. Simpler DSLs, particularly ones used by a single application, are sometimes informally called mini-languages. The line between general-purpose languages and do ...
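A toy illustration of the "mini-language" end of the spectrum follows: a few lines of Python that interpret a tiny, invented command language for turtle-style path descriptions. The command names and semantics are made up; the point is only that the language is specialized to one narrow domain rather than general-purpose.

# A toy interpreter for a tiny, invented mini-language with two commands:
#   forward <distance>   and   turn <degrees>

def run(program):
    heading, distance_travelled = 0, 0
    for line_no, line in enumerate(program.strip().splitlines(), start=1):
        command, arg = line.split()
        if command == "forward":
            distance_travelled += float(arg)
        elif command == "turn":
            heading = (heading + float(arg)) % 360
        else:
            raise SyntaxError(f"line {line_no}: unknown command {command!r}")
    return heading, distance_travelled

script = """
forward 10
turn 90
forward 5
"""
print(run(script))  # -> (90.0, 15.0)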


Kernel Meta Meta Model
KM3, or Kernel Meta Meta Model, is a neutral computer language for writing metamodels and defining domain-specific languages. KM3 was defined at INRIA and is available on the Eclipse platform.
References
Jouault, F. and Bézivin, J. (2006). "KM3: a DSL for Metamodel Specification". In: Proceedings of the 8th IFIP International Conference on Formal Methods for Open Object-Based Distributed Systems, LNCS 4037, Bologna, Italy, pages 171–185.
External links: ADT Download; Eclipse GMT site; softwarefactories.com article; Softmetaware.com article