Data Mapping Access Objects Generator (DMAOG)

DMAOG is a library that uses data mapping rules to automatically generate the code needed to access the data produced by those rules. In this way, a developer only has to write the mapping rules that generate a Knowledge Graph; DMAOG then applies the rules and generates the data access code needed to query the resulting data. Additionally, DMAOG can take care of data changes, reapplying the mapping rules as a background task.

How it works

DMAOG can run mapping rules written in ShExML and RML. After applying the mapping rules, the data is explored through SPARQL queries to get the details of each type. Note that every data object to be generated must have a type declaration of the form :subject a :type. From this analysis, DMAOG builds a set of Java classes that compose the data access layer for the data generated by the mapping rules. This layer can be used in any Java-based project that declares DMAOG as a dependency. When accessing the data, DMAOG relies on reflection to create and populate the objects that are returned to the caller through the generated data access layer.
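
For instance, an object of type Film will only be generated if the data produced by the mapping rules contains a typed resource, as in the following hypothetical fragment (the resource, type and property names are illustrative):

:film1 a :Film ;
    :name "Inception" .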

How to start

The DMAOG library has two entry points, one for each of the steps described earlier: code generation and data access. The CLI handles code generation from the command line in an easy and consistent way, while the data access layer is used by the generated code to encapsulate the data queries.

As an example, we can use the mapping rules hosted in this project in the films.shexml file to generate the data access layer with the following command (see the CLI section for further details about the configuration options):

$ java -jar dmaog.jar -m films.shexml -o . -p com.herminiogarcia.dmaog

This command will create the following files: Actor.java, Film.java, ActorService.java, FilmService.java and DataAccessSingleton.java. All the configuration parameters can be set up inside DataAccessSingleton.java. The services can then be used directly, for example: new FilmService().getAll(); the methods return the results encapsulated in the generated DTO objects. See the "Supported methods in services" section for all the methods offered by the services.
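
As an illustration, a minimal program using the generated layer could look like the following sketch. It assumes the classes were generated into the com.herminiogarcia.dmaog package, as in the command above; the exact DTO fields depend on the mapping rules.

import com.herminiogarcia.dmaog.Film;
import com.herminiogarcia.dmaog.FilmService;

import java.util.List;

public class FilmsExample {
    public static void main(String[] args) {
        FilmService filmService = new FilmService();

        // Retrieve every Film in the data
        List<Film> films = filmService.getAll();

        // Paginated retrieval: the first page with 10 results
        List<Film> firstPage = filmService.getAll(10L, 0L);

        // Total number of Film objects in the data
        Long total = filmService.count();

        System.out.println("Fetched " + films.size() + " of " + total + " films");
    }
}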

CLI

Usage: dmaog [-hV] [--static] [-d=<datafile>] [-dr=<drivers>]
             [-m=<mappingRules>] [-ml=<mappingLanguage>] -o=<outputPath>
             -p=<packageName> [-ps=<password>] [-se=<sparqlEndpoint>]
             [-sep=<sparqlEndpointPassword>] [-seu=<sparqlEndpointUsername>]
             [-sl=<sparqlQueryLimit>] [-u=<username>]
Generate data access objects and services from your mapping rules.
  -d, --datafile=<datafile>
                  Path where the datafile is located if no mapping rules are
                    provided.
      -dr, --drivers=<drivers>
                  Drivers string in case it is not included in ShExML
  -h, --help      Show this help message and exit.
  -m, --mapping=<mappingRules>
                  Path to the file with the mappings
      -ml, --mappingLanguage=<mappingLanguage>
                  Mapping language to use: ShExML or RML
  -o, --output=<outputPath>
                  Path where to generate the output files
  -p, --package=<packageName>
                  Package information for the generated files
      -ps, --password=<password>
                  Password in case of database connection
      -se, --sparqlEndpoint=<sparqlEndpoint>
                  URL pointing to the SPARQL endpoint
      -sep, --sparqlEndpointPassword=<sparqlEndpointPassword>
                  Password for the SPARQL endpoint
      -seu, --sparqlEndpointUsername=<sparqlEndpointUsername>
                  Username for the SPARQL endpoint
      -sl, --sparqlQueryLimit=<sparqlQueryLimit>
                  Limit the amount of requested results by adding a LIMIT x
                    statement to the queries. This can mitigate performance
                    issues when dealing with big graphs

      --static    Exploit mapping rules without executing them
  -u, --username=<username>
                  Username in case of database connection
  -V, --version   Print version information and exit.
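
For example, to generate the data access layer directly from an already populated SPARQL endpoint, without running any mapping rules, an invocation could look like the following (the endpoint URL is illustrative):

$ java -jar dmaog.jar -se http://localhost:3030/films/sparql -o . -p com.herminiogarcia.dmaog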

Supported features

  • Generation of data to files and SPARQL endpoints
  • Using already existing data files and SPARQL endpoints without mapping rules
  • Update actions on files and SPARQL endpoints (authentication included)
  • Static analysis of ShExML rules (no need to run the mapping rules to generate the classes)
  • Multilingual strings
  • Pagination of the results

Supported methods in services

  • getAll(): List[Type] -> Returns all the results for the type
  • getAll(Long limit, Long offset): List[Type] -> Returns all the results for the type within the given page
  • getAll(String rdfFormat): String -> Returns all the results in the requested format
  • count(): Long -> Returns the total number of objects of this type
  • getById(String id): Type -> Returns the object with the given id for the type. Note that the id refers to the local part of the subject URI when dealing with RDF data.
  • getById(String id, String rdfFormat): String -> Returns the object with the given id in the requested format.
  • getByField(String fieldName, String value): List -> Returns all the results whose indicated field matches the given value. Note that the fieldName refers to the local part of the predicate URI when dealing with RDF data.
  • getByField(String fieldName, String value, String rdfFormat): String -> Returns all the results whose indicated field matches the given value, in the requested format.
  • commit(Type instance) -> Deletes the instance (if it exists) in the data store and inserts the new data. It can act as a create or update method.
  • delete(Type instance) -> Deletes the instance in the data store.
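
As an illustration, a create, query and delete round trip with the generated FilmService could look like the sketch below. The setId and setName setters are assumptions for the example; the actual fields of the generated DTO come from the mapping rules.

import com.herminiogarcia.dmaog.Film;
import com.herminiogarcia.dmaog.FilmService;

import java.util.List;

public class CommitExample {
    public static void main(String[] args) {
        FilmService filmService = new FilmService();

        // Populate the DTO: setter names depend on the generated fields
        Film film = new Film();
        film.setId("film4");    // assumed setter; the id maps to the local part of the subject URI
        film.setName("Alien");  // assumed setter for a field derived from the mapping rules

        // Insert the instance, or overwrite it if it already exists
        filmService.commit(film);

        // Query by field: fieldName is the local part of the predicate URI
        List<Film> matches = filmService.getByField("name", "Alien");

        // Remove the instance from the data store
        filmService.delete(film);
    }
}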

Future work

  • Handling Blank Nodes
  • Possibility to use Shapes for code generation
  • Static analysis of RML rules