|International Institute of Social History|
|Data harmonization, Digital Humanities, Socio-Economic Data, Linked Data, Historical Demography, Data Curation|
|I am a postdoctoral researcher at the International Institute of Social History (IISG) in Amsterdam. My expertise lies in (historical) data harmonization, digital humanities, economics, and Linked Data.
In this work package I am responsible for the collection, curation, and harmonization of macro-, meso-, and micro-level datasets. My interests range from digital humanities, socio-economic history, and statistical data (especially historical censuses) to usability and Linked (Open) Data.|
|Vrije Universiteit Amsterdam and University of Amsterdam|
|linked data, escience, digital humanities, data science, (semantic) web, knowledge representation|
|I am a researcher in the Knowledge Representation group of the AI department at the VU. My expertise lies in using semantic technologies (such as linked data and the semantic web) to support experts in science, the humanities, and government, with a focus on usability, provenance, and incentive building.|
|VU University Amsterdam, International Institute of Social History|
|Linked Data; Digital Humanities; Artificial Intelligence|
|Albert Meroño is a postdoc at the Vrije Universiteit Amsterdam and the International Institute of Social History (IISG). He is currently working in WP4 of CLARIAH, which aims at facilitating the integration of socio-historical datasets using Web technology. Albert obtained his PhD in 2016 at the Vrije Universiteit Amsterdam, under the supervision of Frank van Harmelen, Stefan Schlobach, and Andrea Scharnhorst. He also holds a bachelor's degree in Informatics Engineering from Universitat Politecnica de Catalunya (FIB-UPC), and has previously worked at the Institute of Law and Technology (IDT-UAB) developing models for Law using Semantic Web technology. His research interests include Linked Open Data, Government and Statistical data, Artificial Intelligence, and Digital Humanities.|
|Linked Data; Web Development; Large Scale Processing; APIs|
|Laurens specializes in Web Engineering and large-scale infrastructure deployment. He obtained his PhD on scalable Linked Data solutions under the supervision of Frank van Harmelen. Laurens built YASGUI, a popular SPARQL editor, has been responsible for the technical ecosystem surrounding the LOD Laundromat, and has previously worked as a Software Engineer at GfK.|
|Vrije Universiteit Amsterdam|
|http://datalegend.net/; user stories; data conversion and integration; data science; triple store management|
|My research objective is to automatically acquire relevant knowledge and share it so that both humans and machines can make maximum use of it.|