The following tools can help you assess metadata, for example whether it complies with the FAIR principles or other metadata standards. In addition, published overviews and comparisons of FAIR data assessment tools give more detailed insights into how the tools work and can help you select the right one for your purpose.
The FAIR-Checker makes use of semantic web technologies to check if metadata is compliant with the FAIR principles. It was developed by the French Institute for Bioinformatics.
F-UJI FAIR Assessment assesses the FAIRness of research data objects (datasets) based on metrics developed by the FAIRsFAIR project. It requires only the PID or URL of the dataset to be assessed.
FAIR Enough checks whether and to what extent online resources follow the FAIR principles. It is developed by the Institute of Data Science at Maastricht University. Like F-UJI, it requires only the PID or URL of the dataset to be assessed.
The ARDC FAIR Data Self Assessment Tool uses a checklist to assess how FAIR your research dataset is and gives practical tips on how to enhance its FAIRness. It is developed by the Australian Research Data Commons (ARDC).
The FAIR Evaluation Services collect resources and guidelines for assessing the FAIRness of digital resources, with a focus on maturity indicator tests. They are maintained by the FAIRmetrics and FAIRsharing groups.
The AtMoDat Data Checker is a Python-based library that contains checks to ensure compliance with the AtMoDat Standard. It is based on the IOOS compliance checker and was developed for the climate research community.
The Hyve, a support portal for the life sciences, published an overview and evaluation of four of the aforementioned FAIR data assessment tools (2023).
The EOSC FAIR-IMPACT project has also reviewed three of these tools, focusing on their application and potential repurposing for assessing compliance with the FAIR for Research Software (FAIR4RS) principles (2024).