Faculty of Computer Science and Engineering

Permanent URI for this community: https://repository.ukim.mk/handle/20.500.12188/5

The Faculty of Computer Science and Engineering (FCSE) within UKIM is the largest and most prestigious faculty in the field of computer science and technologies in Macedonia, and among the largest faculties in that field in the region. The FCSE teaching staff consists of 50 professors and 30 associates. These include many “best in field” personnel, such as the most referenced scientists in Macedonia and the most influential professors in the ICT industry in the Republic of Macedonia.


Search Results

Now showing 1 - 9 of 9
  • Item type: Publication
    Temporal Authorization Graphs: Pros, Cons and Limits
    (Springer International Publishing, 2022-01)
    Popovski, Ognen
    As more private data enters the web, defining authorization rules for its access is crucial for privacy protection. This paper proposes a policy language that leverages the expressiveness and popularity of SPARQL for flexible access control management, and enforces the protection using temporal graphs. The temporal graphs are created during the authentication phase and are cached for further use. They enable design-time policy testing and debugging, which is necessary to guarantee correctness. Security rarely comes with convenience, and this paper examines the environments in which temporal graphs are suitable. Based on the evaluation results, an approximating function is defined that determines suitability from the expected temporal graph size.
  • Item type: Publication
    MOCHA 2017 as a Challenge for Virtuoso
    (Springer International Publishing, 2017-10)
    Spasić, Mirko
    The Mighty Storage Challenge (MOCHA) aims to test the performance of solutions for SPARQL processing in several aspects relevant for modern Linked Data applications. Virtuoso, by OpenLink Software, is a modern enterprise-grade solution for data access, integration, and relational database management, which provides a scalable RDF Quad Store. In this paper, we present a short overview of Virtuoso with a focus on RDF triple storage and SPARQL query execution. Furthermore, we showcase the final results of the MOCHA 2017 challenge and its tasks, along with a comparison between the performance of our system and the other participating systems.
  • Item type: Publication
    MOCHA2017: The Mighty Storage Challenge at ESWC 2017
    (Springer International Publishing, 2017-10)
    Georgala, Kleanthi; Spasić, Mirko; Petzka, Henning; Röder, Michael
    The aim of the Mighty Storage Challenge (MOCHA) at ESWC 2017 was to test the performance of solutions for SPARQL processing in aspects that are relevant for modern applications. These include ingesting data, answering queries on large datasets and serving as a backend for applications driven by Linked Data. The challenge tested the systems against data derived from real applications and with realistic loads. An emphasis was put on dealing with data in the form of streams or updates.
  • Item type: Publication
    Authorization Proxy for SPARQL Endpoints
    (Springer International Publishing, 2017-09)
    A large number of emerging services expose their data using various Application Programming Interfaces (APIs). Consuming and fusing data from various providers is a challenging task, since a separate client implementation is usually required for each API. The Semantic Web provides a set of standards and mechanisms for unifying data representation on the Web, as well as means of uniform access via its query language, SPARQL. However, the lack of data protection mechanisms for the SPARQL query language and its HTTP-based data access protocol might be the main reason why it is not widely accepted as a data exchange and linking mechanism. This paper presents an authorization proxy that solves this problem using query interception and rewriting. For a given client, it returns only the permitted data for the requested query, defined via a flexible policy language that combines the RDF and SPARQL standards for policy definition.
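The query interception and rewriting idea can be sketched naively: before forwarding a client's query to the endpoint, the proxy restricts it to data that client is permitted to read. The graph name and the string-level rewrite below are illustrative assumptions; a real proxy such as the one in the paper operates on the parsed query, not the raw string:

```python
import re

def rewrite_query(sparql, allowed_graph):
    """Toy illustration of query interception: restrict a SELECT query
    to the single named graph the client is authorized to read.
    A naive string rewrite that only handles simple SELECT ... WHERE
    queries; a real proxy rewrites the parsed query algebra instead."""
    return re.sub(
        r"\bWHERE\b",
        f"FROM <{allowed_graph}> WHERE",
        sparql,
        count=1,
        flags=re.IGNORECASE,
    )

q = "SELECT ?s ?p ?o WHERE { ?s ?p ?o }"
print(rewrite_query(q, "http://example.org/graphs/public"))
# SELECT ?s ?p ?o FROM <http://example.org/graphs/public> WHERE { ?s ?p ?o }
```

The client never sees the unrewritten endpoint, so the permitted-data guarantee holds regardless of what query it submits.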
  • Item type: Publication
    Benchmarking Virtuoso 8 at the Mighty Storage Challenge 2018: Challenge Results
    (Springer International Publishing, 2018-10)
    Spasić, Mirko
    Following the success of Virtuoso at last year's Mighty Storage Challenge - MOCHA 2017, we decided to participate once again and test the latest Virtuoso version against the new tasks which comprise the MOCHA 2018 challenge. The aim of the challenge is to test the performance of solutions for SPARQL processing in aspects relevant for modern applications: ingesting data, answering queries on large datasets and serving as a backend for applications driven by Linked Data. The challenge tests the systems against data derived from real applications and with realistic loads, with an emphasis on dealing with changing data in the form of streams or updates. Virtuoso, by OpenLink Software, is a modern enterprise-grade solution for data access, integration, and relational database management, which provides a scalable RDF Quad Store. In this paper, we present the final challenge results from MOCHA 2018 for Virtuoso v8.0, compared to the other participating systems. Based on these results, Virtuoso v8.0 was declared the overall winner of MOCHA 2018.
  • Item type: Publication
    MOCHA2018: The Mighty Storage Challenge at ESWC 2018
    (Springer International Publishing, 2018-10)
    Georgala, Kleanthi; Spasić, Mirko; Papakonstantinou, Vassilis; Stadler, Claus
    The aim of the Mighty Storage Challenge (MOCHA) at ESWC 2018 was to test the performance of solutions for SPARQL processing in aspects that are relevant for modern applications. These include ingesting data, answering queries on large datasets and serving as a backend for applications driven by Linked Data. The challenge tested the systems against data derived from real applications and with realistic loads. An emphasis was put on dealing with data in the form of streams or updates.
  • Item type: Publication
    Semantic Web and Data Science Integration Using Computational Books
    (Ss. Cyril and Methodius University in Skopje, Faculty of Computer Science and Engineering, Republic of North Macedonia, 2021-05)
    Mileski, Dimitar
    This paper presents an architecture for developing web applications that explore semantic knowledge graphs through parameterized interactive visualizations. The web interface and the interactive parameterized visualizations, in the form of a computational book, provide a way in which knowledge graphs can be explored. An important benefit of this approach to building interactive web visualizations is that we can substitute the knowledge graph entities with other entities within the existing interactive visualizations, execute commands in a web-based environment, and get the same visualization for the new entities. With this architecture, various applications for interactive visualization of knowledge graphs can be developed, which can also stimulate interest in exploring the graph and its entities. We also present a publicly available open-source use case built using the concepts discussed in this paper.
  • Item type: Publication
    A GeoSPARQL Compliance Benchmark
    (MDPI, 2021-07-16)
    Homburg, Timo; Spasić, Mirko
    GeoSPARQL is an important standard for the geospatial linked data community, given that it defines a vocabulary for representing geospatial data in RDF, defines an extension to SPARQL for processing geospatial data, and provides support for both qualitative and quantitative spatial reasoning. However, what the community is missing is a comprehensive and objective way to measure the extent of GeoSPARQL support in GeoSPARQL-enabled RDF triplestores. To fill this gap, we developed the GeoSPARQL compliance benchmark. We propose a series of tests that check for the compliance of RDF triplestores with the GeoSPARQL standard, in order to test how many of the requirements outlined in the standard a tested system supports. This topic is of concern because the support of GeoSPARQL varies greatly between different triplestore implementations, and the extent of support is of great importance for different users. In order to showcase the benchmark and its applicability, we present a comparison of the benchmark results of several triplestores, providing insight into their current GeoSPARQL support and the overall GeoSPARQL support in the geospatial linked data domain.
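A compliance benchmark of this kind can be thought of as a list of (requirement, expected answer) pairs scored against the system under test. The sketch below is a toy model with invented requirement queries and a stubbed endpoint, not the benchmark's actual implementation:

```python
# Hypothetical sketch of scoring a triplestore's standard compliance:
# each requirement is a (description, query, expected-result) triple,
# and the score is the fraction the system answers correctly.

REQUIREMENTS = [
    # Illustrative placeholders, not real benchmark queries.
    ("geof:distance function", "SELECT (geof:distance(...) AS ?d) ...", "0.0"),
    ("geo:sfContains relation", "ASK { ... geo:sfContains ... }", "true"),
]

def run_query(endpoint, query):
    """Stub standing in for a real SPARQL endpoint call."""
    return endpoint(query)

def compliance_score(endpoint):
    passed = sum(
        1 for _, q, expected in REQUIREMENTS
        if run_query(endpoint, q) == expected
    )
    return passed / len(REQUIREMENTS)

# A fake triplestore that only supports the ASK-based requirement:
partial_store = lambda q: "true" if q.startswith("ASK") else None
print(compliance_score(partial_store))  # -> 0.5
```

Scoring per requirement, rather than pass/fail overall, is what makes the varying degrees of GeoSPARQL support across triplestores directly comparable.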
  • Item type: Publication
    Transforming Geospatial RDF Data into GeoSPARQL-Compliant Data: A Case of Traffic Data
    (Ss. Cyril and Methodius University in Skopje, Faculty of Computer Science and Engineering, Republic of North Macedonia, 2019-05)
    Spasić, Mirko
    Geospatial RDF datasets tend to use latitude and longitude properties to denote the geographic location of the entities described within them. Geographic information systems, on the other hand, prefer WKT and GML geometries when working with geospatial data. In this paper, we present a process of RDF data transformation which produces a GeoSPARQL-compliant dataset, using an RDF geospatial dataset with traffic data as a starting point. The traffic data consists of vehicle traces, each composed of numerous points with specific latitude and longitude values. With our transformations, we enable querying of the dataset with GeoSPARQL extensions, which can be used to feed a GIS solution.
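The core of such a transformation, turning latitude/longitude pairs into GeoSPARQL `geo:asWKT` literals, can be sketched as follows. The property names and the in-memory dictionary representation are simplifying assumptions, not the paper's actual pipeline; note that WKT points are written longitude-first:

```python
# Sketch (hypothetical prefixed names) of lifting plain lat/long
# values into a GeoSPARQL-compliant geometry literal.

def to_wkt_point(lat, lon):
    # WKT orders coordinates as (longitude latitude).
    return f"POINT({lon} {lat})"

def transform(entity):
    """Replace latitude/longitude keys with a geo:asWKT literal."""
    lat = entity.pop("wgs84:lat")
    lon = entity.pop("wgs84:long")
    entity["geo:asWKT"] = f'"{to_wkt_point(lat, lon)}"^^geo:wktLiteral'
    return entity

# One point of a vehicle trace (coordinates of Skopje, for illustration):
trace_point = {"wgs84:lat": 41.9981, "wgs84:long": 21.4254}
print(transform(trace_point))
# {'geo:asWKT': '"POINT(21.4254 41.9981)"^^geo:wktLiteral'}
```

Once every trace point carries a `geo:asWKT` geometry, GeoSPARQL filter functions can be applied to the dataset directly.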