Thank you for your participation in the RIADD challenge. The competition, associated with IEEE ISBI 2021, has now concluded, and the validation dataset has been made publicly available. Researchers can now evaluate their models by submitting results on the test set.
The following rules are applicable to those who request to participate (individually or as a team) and download the data:
- All information provided during registration, whether as a team or as an individual, must be complete and correct. Note: Anonymous registrations are not allowed.
Disclosure of the Competition Results:
- Evaluations of results uploaded to this website (for any of the sub-challenges) will be made publicly available on the leaderboard of this site; by submitting results, you grant us permission to publish our evaluation.
Submission of Results and Paper:
- All participants are allowed to make 5 submissions per day.
- To participate in the final round of the challenge, it is mandatory to submit a 4-page paper in ISBI style to the organizers explaining your algorithm.
Ownership of Algorithms:
- Participating teams retain full ownership of their algorithms and any associated intellectual property they develop in the course of participating in the challenge.
- The top-performing teams and individuals (selected by the experts based on the performance of their methods) will be invited to contribute to one or more joint journal papers, with a maximum of 2 authors per team per sub-challenge, describing and summarizing the methods used and results found in this challenge. The paper will be submitted to a high-impact journal in the field.
- The organizers will review each paper for sufficient detail to understand and reproduce the method, and reserve the right to exclude participants from the joint journal paper if their method description is inadequate.
- An appropriate citation must be included in any scientific publication (journal articles, conference papers, technical reports, presentations at conferences and meetings) that uses the data shared in this challenge. The citation must refer to the data descriptor and to the publication that results from this challenge.
- Data Descriptor: Samiksha Pachade, Prasanna Porwal, Dhanshree Thulkar, Manesh Kokare, Girish Deshmukh, Vivek Sahasrabuddhe, Luca Giancardo, Gwenolé Quellec, and Fabrice Mériaudeau, 2021. Retinal Fundus Multi-Disease Image Dataset (RFMiD): A Dataset for Multi-Disease Detection Research. Data, 6(2), p.14. Available (Open Access): https://www.mdpi.com/2306-5729/6/2/14
- For all sub-challenges, participants may use other datasets for the development of a method that will be submitted to the challenge, provided that the datasets are publicly available and clearly stated in the submitted paper.
- Examples include the Kaggle DR, IDRiD, Messidor, and APTOS datasets. This is a non-exhaustive list; additional data sources identified by the organizing team (themselves or through interested participants) by December 31, 2020 will also be allowed for this challenge.
- This challenge also discourages private grading of public data. That is, although teams are allowed to use public data and to create private annotations for specific tasks in order to train their models, any team that does so must release those data and annotations to the public through this challenge before December 31, 2020 for their performance based on the private annotations to be counted.