Faculty of Computer Science and Engineering
Permanent URI for this community: https://repository.ukim.mk/handle/20.500.12188/5
The Faculty of Computer Science and Engineering (FCSE) within UKIM is the largest and most prestigious faculty in the field of computer science and technologies in Macedonia, and among the largest faculties in that field in the region.
The FCSE teaching staff consists of 50 professors and 30 associates, including some of the most-cited scientists in Macedonia and some of the most influential professors in the country's ICT industry.
Search Results (6 results)
Item type: Publication
RDFGraphGen: An RDF Graph Generator Based on SHACL Shapes (Springer Nature (Singapore), 2026-04-01)
Vecovska, Marija; Jakubowski, Maxime; Hose, Katja
Developing and testing modern RDF-based applications often requires access to RDF datasets with certain characteristics. Unfortunately, domain-specific knowledge graphs that conform to a particular set of characteristics are difficult to find publicly. Hence, in this paper we propose RDFGraphGen, an open-source RDF graph generator that uses characteristics provided in the form of SHACL (Shapes Constraint Language) shapes to generate synthetic RDF graphs. RDFGraphGen is domain-agnostic, with a configurable graph structure, value constraints, and distributions. It also comes with a number of predefined values for popular schema.org classes and properties, for more realistic graphs. Our results show that RDFGraphGen is scalable and can generate small, medium, and large RDF graphs in any domain.
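The shape-driven generation idea described in the abstract above can be sketched in a few lines. The shape encoding, property names, and constraint keys below are simplified illustrations, not RDFGraphGen's actual API: a parsed SHACL shape is reduced to a target class plus per-property value constraints, and conforming N-Triples are emitted for each synthetic entity.

```python
import random

# Hypothetical, simplified stand-in for a parsed SHACL shape:
# a target class plus per-property value constraints.
PERSON_SHAPE = {
    "target_class": "http://schema.org/Person",
    "properties": {
        "http://schema.org/givenName": {"values": ["Ana", "Marko", "Elena"]},
        "http://schema.org/age": {"min": 18, "max": 90},
    },
}

RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

def generate_triples(shape, count, seed=0):
    """Generate synthetic N-Triples lines conforming to the simplified shape."""
    rng = random.Random(seed)  # seeded for reproducible output
    triples = []
    for i in range(count):
        subj = f"<http://example.org/entity/{i}>"
        triples.append(f"{subj} <{RDF_TYPE}> <{shape['target_class']}> .")
        for prop, constraint in shape["properties"].items():
            if "values" in constraint:
                obj = f"\"{rng.choice(constraint['values'])}\""
            else:
                obj = f"\"{rng.randint(constraint['min'], constraint['max'])}\""
            triples.append(f"{subj} <{prop}> {obj} .")
    return triples

triples = generate_triples(PERSON_SHAPE, 3)
print(len(triples))  # 3 entities x (1 type triple + 2 property triples) = 9
```

A real implementation would additionally honour cardinality constraints (sh:minCount/sh:maxCount), node shapes referencing other shapes, and value distributions, which is where the actual tool goes well beyond this sketch.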
Item type: Publication
MOCHA 2017 as a Challenge for Virtuoso (Springer International Publishing, 2017-10)
Spasić, Mirko
The Mighty Storage Challenge (MOCHA) aims to test the performance of solutions for SPARQL processing in several aspects relevant for modern Linked Data applications. Virtuoso, by OpenLink Software, is a modern enterprise-grade solution for data access, integration, and relational database management, which provides a scalable RDF Quad Store. In this paper, we present a short overview of Virtuoso with a focus on RDF triple storage and SPARQL query execution. Furthermore, we showcase the final results of the MOCHA 2017 challenge and its tasks, along with a comparison between the performance of our system and the other participating systems.
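SPARQL query execution against an endpoint such as Virtuoso's happens over the standard SPARQL 1.1 Protocol: the query travels as an HTTP parameter and the result format is negotiated via the Accept header. A minimal sketch with the Python standard library; the endpoint URL is the conventional default Virtuoso path and should be treated as a placeholder:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder: Virtuoso's conventional default endpoint path; adjust per install.
ENDPOINT = "http://localhost:8890/sparql"
QUERY = "SELECT ?s WHERE { ?s ?p ?o } LIMIT 10"

def build_sparql_request(endpoint, query):
    """Build a SPARQL 1.1 Protocol GET request asking for JSON results."""
    url = endpoint + "?" + urlencode({"query": query})
    return Request(url, headers={"Accept": "application/sparql-results+json"})

req = build_sparql_request(ENDPOINT, QUERY)
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen`) returns a JSON document with a `head`/`results` structure defined by the SPARQL Query Results JSON format.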
Item type: Publication
MOCHA2017: The Mighty Storage Challenge at ESWC 2017 (Springer International Publishing, 2017-10)
Georgala, Kleanthi; Spasić, Mirko; Petzka, Henning; Röder, Michael
The aim of the Mighty Storage Challenge (MOCHA) at ESWC 2017 was to test the performance of solutions for SPARQL processing in aspects that are relevant for modern applications. These include ingesting data, answering queries on large datasets, and serving as a backend for applications driven by Linked Data. The challenge tested the systems against data derived from real applications and with realistic loads. An emphasis was put on dealing with data in the form of streams or updates.
Item type: Publication
Authorization Proxy for SPARQL Endpoints (Springer International Publishing, 2017-09)
A large number of emerging services expose their data using various Application Programming Interfaces (APIs). Consuming and fusing data from various providers is a challenging task, since a separate client implementation is usually required for each API. The Semantic Web provides a set of standards and mechanisms for unifying data representation on the Web, as well as means of uniform access via its query language, SPARQL. However, the lack of data protection mechanisms for the SPARQL query language and its HTTP-based data access protocol might be the main reason why it is not widely accepted as a data exchange and linking mechanism. This paper presents an authorization proxy that solves this problem using query interception and rewriting. For a given client, it returns only the permitted data for the requested query, as defined via a flexible policy language that combines the RDF and SPARQL standards for policy definition.
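The interception-and-rewriting approach from the abstract above can be illustrated with a toy rewriter that confines a client's query to the named graphs it is permitted to read. The policy representation and the single rewriting rule here are simplified assumptions for illustration, not the paper's actual policy language:

```python
import re

# Hypothetical policy: client id -> named graphs the client may read.
POLICY = {
    "client-42": ["http://example.org/graphs/public"],
}

def rewrite_query(client_id, query):
    """Confine the query's WHERE pattern to the client's permitted named graphs."""
    allowed = POLICY.get(client_id, [])
    if not allowed:
        raise PermissionError(f"no graphs permitted for {client_id}")
    values = " ".join(f"<{g}>" for g in allowed)
    # Wrap the original group graph pattern in GRAPH ?__g { ... } and
    # restrict ?__g with a VALUES clause. Assumes a single top-level WHERE block.
    match = re.search(r"WHERE\s*\{(.*)\}\s*$", query, re.S | re.I)
    if not match:
        raise ValueError("unsupported query shape")
    pattern = match.group(1)
    return (query[:match.start()] +
            "WHERE { VALUES ?__g { " + values + " } GRAPH ?__g { " + pattern + " } }")

q = "SELECT ?name WHERE { ?person <http://schema.org/name> ?name }"
print(rewrite_query("client-42", q))
```

A production proxy would of course parse the query into an algebra tree rather than use regular expressions, and would handle subqueries, property paths, and updates; the point of the sketch is only the shape of the rewrite.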
Item type: Publication
Benchmarking Virtuoso 8 at the Mighty Storage Challenge 2018: Challenge Results (Springer International Publishing, 2018-10)
Spasić, Mirko
Following the success of Virtuoso at last year’s Mighty Storage Challenge (MOCHA 2017), we decided to participate once again and test the latest Virtuoso version against the new tasks which comprise the MOCHA 2018 challenge. The aim of the challenge is to test the performance of solutions for SPARQL processing in aspects relevant for modern applications: ingesting data, answering queries on large datasets, and serving as a backend for applications driven by Linked Data. The challenge tests the systems against data derived from real applications and with realistic loads, with an emphasis on dealing with changing data in the form of streams or updates. Virtuoso, by OpenLink Software, is a modern enterprise-grade solution for data access, integration, and relational database management, which provides a scalable RDF Quad Store. In this paper, we present the final challenge results from MOCHA 2018 for Virtuoso v8.0, compared to the other participating systems. Based on these results, Virtuoso v8.0 was declared the overall winner of MOCHA 2018.
Item type: Publication
MOCHA2018: The Mighty Storage Challenge at ESWC 2018 (Springer International Publishing, 2018-10)
Georgala, Kleanthi; Spasić, Mirko; Papakonstantinou, Vassilis; Stadler, Claus
The aim of the Mighty Storage Challenge (MOCHA) at ESWC 2018 was to test the performance of solutions for SPARQL processing in aspects that are relevant for modern applications. These include ingesting data, answering queries on large datasets, and serving as a backend for applications driven by Linked Data. The challenge tested the systems against data derived from real applications and with realistic loads. An emphasis was put on dealing with data in the form of streams or updates.
