RDF data extractors — Stack Overflow
Dec 19, 2023 — In the article 'Building relational data from RDF triple pairs' I describe an RDF data extractor tool and give some examples of building RDF data extractors on top of an existing data store.

Using RDF in the Cloud — Microsoft
Dec 12, 2023 — In this article I describe how to use RDF in the cloud. RDF is a powerful tool for distributed computing environments.

Dec 31, 2023 — In this post I describe the data models for the RDF Data Service and the application layer of the RDF Data Service.

How a small Web app turns data from RDF triple pairs into CSV — Web App Central
Sep 23, 2023 — To convert a document of RDF triples into a table in a relational database, you first convert each triple into a triple tree.

The RDF triples in a data store — GitHub
Sep 26, 2023 — I describe the steps we took to create the RDF Data Store. We had two data model stores: the first for all of our RDF triples, the second for RDF triple pairs. The RDF Data Store uses a combination of SQL and Apache Hive to store relational data.

Dec 16, 2023 — Here's the data model for the RDF Data Service, which stores RDF triple pairs and makes them publicly available as a relational database on Amazon.

Making it easy to understand RDF triples — Stack Overflow
Dec 24, 2023 — This article describes how to build RDF triples on top of a triple store, and how to convert RDF triples in this format into a table in a relational database.

Using RDF in the Cloud — Microsoft
Dec 4, 2023 — In this article I describe how RDF can be useful in the cloud. Microsoft is building an application programming interface that lets you use RDF. The application is written in Java and implements an RDF database service for Java in the cloud. It has four components: a service, two Java-based client applications, and a Java API.
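The triples-to-relational-table conversion mentioned in 'How a small Web app turns data from RDF triple pairs into CSV' could be sketched roughly as follows. This is a minimal illustration, not the article's actual code: the function name `triples_to_csv`, the sample data, and the dictionary-of-dictionaries intermediate (standing in for the "triple tree") are all assumptions.

```python
import csv
import io
from collections import defaultdict

def triples_to_csv(triples):
    """Pivot RDF (subject, predicate, object) triples into a CSV table
    with one row per subject and one column per predicate.

    Hypothetical sketch -- the intermediate nested dict stands in for
    the 'triple tree' the article alludes to.
    """
    # Group each subject's predicate/object pairs.
    tree = defaultdict(dict)
    predicates = []
    for s, p, o in triples:
        if p not in predicates:
            predicates.append(p)
        tree[s][p] = o

    # Emit one relational row per subject; missing predicates become
    # empty cells.
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["subject"] + predicates)
    for subject, props in tree.items():
        writer.writerow([subject] + [props.get(p, "") for p in predicates])
    return out.getvalue()

# Illustrative sample data (not from the article).
triples = [
    ("alice", "name", "Alice"),
    ("alice", "age", "30"),
    ("bob", "name", "Bob"),
]
print(triples_to_csv(triples))
```

One design point worth noting: pivoting like this assumes at most one object per (subject, predicate) pair; multi-valued predicates would need either repeated rows or a join table.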
Building RDF data store using Apache SQL Server — Web App Central
Dec 10, 2023 — Let's use Apache Hadoop to implement the RDF Data Store.
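A Hadoop job for an RDF Data Store would typically follow the MapReduce pattern: map each triple to a (subject, value) pair, then reduce by subject to build per-subject records. The sketch below simulates that pattern in plain Python rather than running an actual Hadoop cluster; the function names, mapper/reducer shape, and sample data are illustrative assumptions, not the post's actual job.

```python
from collections import defaultdict

def map_phase(triples):
    # Mapper: emit (key, value) pairs keyed by subject, as a Hadoop
    # mapper would emit them to the shuffle stage.
    for s, p, o in triples:
        yield s, (p, o)

def reduce_phase(pairs):
    # Shuffle + reducer: collect every (predicate, object) value that
    # shares a subject key into one record.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return dict(grouped)

# Illustrative sample data (not from the post).
triples = [
    ("alice", "knows", "bob"),
    ("alice", "name", "Alice"),
    ("bob", "name", "Bob"),
]
records = reduce_phase(map_phase(triples))
print(records["alice"])  # [('knows', 'bob'), ('name', 'Alice')]
```

On a real cluster the same mapper and reducer logic would be expressed as a Hadoop Streaming job or as Hive queries over a triples table, with the shuffle handled by the framework.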