An Evaluation queue allows people to submit Synapse entities, such as Files or Docker images, for evaluation. Evaluation queues are designed to support open-access data analysis and modeling challenges in Synapse. This framework provides tools for administrators to collect and analyze models that Synapse users create for a specific goal or purpose.
To create a queue, you must first create a Synapse
Project or have edit permissions on an existing Project. To create a Synapse Project, follow the instructions on the Project and Data Management page.
Once you’ve created your project, navigate to it and add
/challenge to the URL (e.g. www.synapse.org/#!Synapse:syn12345/challenge). Click Tools in the right corner and select Add Evaluation Queue.
An Evaluation queue accepts several parameters that let you customize its behavior.
Optionally, you can restrict how and when users submit by setting a quota.
An Evaluation queue can only have one quota. The length of time the queue is open, the start date, the round duration, and the number of rounds are required parameters; setting a submission limit is optional.
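For illustration, a quota can be sketched as a small structure whose field names follow the Synapse SubmissionQuota model (the values here are placeholders), showing how the required parameters together determine how long the queue stays open:

```python
# Sketch of a SubmissionQuota, the object that holds a queue's quota.
# Field names follow the Synapse REST model; values are placeholders.
ONE_DAY_MS = 24 * 60 * 60 * 1000

quota = {
    "firstRoundStart": "2024-01-01T00:00:00.000Z",  # start date of round 1
    "roundDurationMillis": 7 * ONE_DAY_MS,          # each round lasts one week
    "numberOfRounds": 4,                            # required, like the fields above
    "submissionLimit": 3,                           # optional: 3 submissions per round
}

# The queue accepts submissions for roundDurationMillis * numberOfRounds:
open_days = quota["roundDurationMillis"] * quota["numberOfRounds"] // ONE_DAY_MS
print(open_days)  # 28
```

In other words, the total open time is not set directly; it falls out of the round duration multiplied by the number of rounds.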
Each Evaluation has sharing settings, which limit who can interact with the Evaluation.
To set the sharing settings, go to the Challenge tab to view your list of Evaluations. Click on the Share button per Evaluation and share it with the teams or individuals you would like.
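Sharing can also be done programmatically with the Python client's setPermissions. As a sketch: the access-type names below follow the Synapse ACL model and should be checked against your client version, and the queue and team IDs in the usage note are placeholders:

```python
# Assumed mapping from the web UI's sharing levels to ACL access types;
# verify these against the Synapse REST documentation.
CAN_SUBMIT = ["READ", "SUBMIT"]
CAN_SCORE = ["READ", "SUBMIT", "READ_PRIVATE_SUBMISSION", "UPDATE_SUBMISSION"]

def share_queue(syn, evaluation_id, principal_id, access_type=CAN_SUBMIT):
    """Grant a team or user the given access to an Evaluation queue."""
    evaluation = syn.getEvaluation(evaluation_id)
    syn.setPermissions(evaluation, principalId=principal_id, accessType=access_type)

# Usage with a logged-in session (IDs are placeholders):
# import synapseclient
# syn = synapseclient.login()
# share_queue(syn, "9610091", 3345678)
```

Passing an empty access_type list is one way to revoke a principal's access entirely.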
While there isn’t technically a way of “closing” an Evaluation queue, there are multiple ways to discontinue submissions for users. Users can submit to a queue only if they have Can submit permissions on it. Note that if you have the ability to modify the permissions of a queue, you will still be able to submit to the queue due to your administrative access.
Any Synapse entity may be submitted to an Evaluation Queue.
In the R and Python examples, you need to know the ID of the evaluation queue. This ID must be provided to you by administrators of the queue.
The submission function takes two optional parameters: name and team. A name can be provided to customize the submission; participants often use the submission name to identify their submissions. If a name is not provided, the name of the entity being submitted is used. For example, if you submit a File named testfile.txt and do not specify a submission name, it defaults to testfile.txt. A team name can be provided to recognize a group of contributors.
import synapseclient

syn = synapseclient.login()

evaluation_id = "9610091"
my_submission_entity = "syn1234567"

submission = syn.submit(
    evaluation = evaluation_id,
    entity = my_submission_entity,
    name = "My Submission",  # An arbitrary name for your submission
    team = "My Team Name")   # Optional, can also pass a Team object or id
library(synapser)

synLogin()

evaluation_id <- "9610091"
my_submission_entity <- "syn1234567"

submission <- synSubmit(
    evaluation = evaluation_id,
    entity = my_submission_entity,
    name = "My Submission",  # An arbitrary name for your submission
    team = "My Team Name")   # Optional, can also pass a Team object or id
Every submission you make to an Evaluation queue has a unique ID. This ID should not be confused with Synapse IDs, which start with syn. Every submission also has an associated SubmissionStatus object.
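A minimal sketch of retrieving a submission and its status with the Python client, assuming a logged-in session (the submission ID in the usage note is a placeholder):

```python
# Sketch: look up a submission and its status by submission ID
# (a plain numeric ID, not a syn ID).
def get_submission_info(syn, submission_id):
    submission = syn.getSubmission(submission_id, downloadFile=False)
    status = syn.getSubmissionStatus(submission_id)
    return submission, status

# With a logged-in session:
# import synapseclient
# syn = synapseclient.login()
# submission, status = get_submission_info(syn, "9698765")  # placeholder ID
# print(status.status)  # e.g. RECEIVED or SCORED
```

Setting downloadFile=False retrieves only the submission's metadata, which is usually all you need when checking on a submission.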
Navigate to a file in Synapse and click on Tools in the upper right-hand corner. Select Submit To Challenge to pick the challenge for your submission. Follow the provided steps to complete your submission.
Submissions can be viewed through leaderboard widgets on Wiki pages.
Submission annotations can be added to a SubmissionStatus object to be displayed. Each of these annotations can be set to either public or private. Private annotations are not visible unless the team or Synapse user has Can Score or Admin permissions on the Evaluation queue. Public annotations can be viewed by any team or user that has Can View permissions.
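As a sketch of how a scorer might attach such annotations with the Python client: the to_submission_status_annotations helper and its is_private flag follow the client's annotation utilities and should be checked against your client version, and the IDs and scores in the usage note are placeholders:

```python
def annotate_submission(syn, submission_id, scores, private=False):
    """Attach annotations to a submission's SubmissionStatus.

    private=False makes the values public, i.e. visible to anyone with
    Can View permissions (for example, on a leaderboard).
    """
    from synapseclient.annotations import to_submission_status_annotations

    status = syn.getSubmissionStatus(submission_id)
    status.annotations = to_submission_status_annotations(scores, is_private=private)
    return syn.store(status)

# With a logged-in session that has Can Score access (placeholder values):
# annotate_submission(syn, "9698765", {"auc": 0.91, "rank": 2})
```

Storing the updated SubmissionStatus is what makes the annotations available to leaderboard queries.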
Once you click on Leaderboard, you will need to input your own query statement, such as
select * from evaluation_9610091. Remember to replace 9610091 with your own Evaluation queue ID. To view all the columns available, click Refresh Columns.
Clicking Refresh Columns will add these default columns.
The appearance of columns in a leaderboard can be modified by changing the renderer. To do so, set the value of the ‘Renderer’ attribute when configuring the leaderboard widget. These are the available renderers:
Once you are happy with your leaderboard configuration, save both the configuration and the Wiki page to see these updates.
You may embed a
Submit To Evaluation widget on a Wiki page to improve visibility of your Evaluation queue. The widget allows participants to submit to multiple Evaluation queues within a Project or a single Evaluation queue.
Currently, this Wiki widget is required to submit Synapse Projects to an Evaluation queue. Synapse Docker repositories cannot be submitted through this widget.
The “Evaluation Queue unavailable message” is customizable. A queue may appear unavailable to a user if:
To learn how to create a Wiki page, please visit the Wikis article.