What is Highlighter?

Highlighter gives you the freedom to focus on machine learning, rather than the supporting software. We take care of the infrastructure, allowing you to develop and deploy custom machine learning models in an efficient and cost-effective way.

Highlighter provides a seamless interface to label images and train models. These models can then be deployed via an API for inference in production. Progressive improvement of the labelling processes is supported through a reporting and feedback loop.

Which machine learning project components does Highlighter support?

Highlighter currently supports the following project components:

  • Assignment of a team
  • Selection of models
  • Import and annotation of training data
  • Initiation of training
  • Integration with business services via API (see the sketch below)

We are continuing to expand the functionality of the services provided.
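
As an illustration of the API integration item above, the sketch below shows the general shape of a token-authenticated inference request in Python. The endpoint URL, request fields and response handling are hypothetical placeholders, not Highlighter's documented API; consult the Silverpond team for the actual integration details.

```python
# Illustration only: the endpoint URL and request fields below are
# hypothetical placeholders, not Highlighter's documented API.
import requests

API_TOKEN = "your-api-access-token"                    # created by a manager
ENDPOINT = "https://example.highlighter.ai/predict"    # hypothetical URL

def request_inference(image_path: str) -> dict:
    """Post an image to a (hypothetical) inference endpoint and return its JSON reply."""
    with open(image_path, "rb") as image_file:
        response = requests.post(
            ENDPOINT,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"image": image_file},
        )
    response.raise_for_status()
    return response.json()

# result = request_inference("frames/frame_000010.jpg")
```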

Can I use Highlighter for video?

No, Highlighter does not currently support native video. However, you can convert video to frames manually, allowing you to use Highlighter to analyse video data. We are working on a solution to natively support video and hope to launch this later in the year.
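
As a minimal sketch of that manual conversion, the following Python snippet uses OpenCV to sample frames from a video file into a folder of JPEGs ready for import; the file names and sampling rate are illustrative.

```python
# A minimal sketch, assuming OpenCV (cv2) is installed; file names and
# the sampling rate are illustrative.
import cv2
from pathlib import Path

def video_to_frames(video_path: str, out_dir: str, every_nth: int = 10) -> int:
    """Save every `every_nth` frame of `video_path` as a JPEG in `out_dir`."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    capture = cv2.VideoCapture(video_path)
    frame_index = saved = 0
    while True:
        ok, frame = capture.read()
        if not ok:                      # end of video or read error
            break
        if frame_index % every_nth == 0:
            cv2.imwrite(f"{out_dir}/frame_{frame_index:06d}.jpg", frame)
            saved += 1
        frame_index += 1
    capture.release()
    return saved

# video_to_frames("inspection.mp4", "frames", every_nth=10)
```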

Can I use Highlighter for text and audio projects?

No, currently Highlighter cannot be used for text or audio projects.

Does Highlighter have pre-trained models?

Yes, Highlighter has some pre-trained models available. Please contact the Silverpond team for more information.

How is data exported?

Data can be exported through the queue system in JSON or PASCAL VOC format.
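
For the PASCAL VOC option, the sketch below reads the labelled bounding boxes from a single VOC annotation XML file using only the Python standard library; the file path is illustrative and the snippet is not specific to Highlighter's export layout.

```python
# A small sketch for reading one PASCAL VOC annotation file; the path
# used in the example call is illustrative.
import xml.etree.ElementTree as ET

def read_voc_boxes(xml_path: str):
    """Return (image filename, [(class name, xmin, ymin, xmax, ymax), ...])."""
    root = ET.parse(xml_path).getroot()
    filename = root.findtext("filename")
    boxes = []
    for obj in root.findall("object"):
        bndbox = obj.find("bndbox")
        boxes.append((
            obj.findtext("name"),
            int(float(bndbox.findtext("xmin"))),
            int(float(bndbox.findtext("ymin"))),
            int(float(bndbox.findtext("xmax"))),
            int(float(bndbox.findtext("ymax"))),
        ))
    return filename, boxes

# filename, boxes = read_voc_boxes("export/apple_0001.xml")
```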

How does a team operate?

There are three roles within a team: owner, manager and labeller. When you create your Highlighter account, you become an owner, which allows you to create other users and manage billing. Managers can assign work, curate guests, supervise team contributions and create API access tokens. Labellers perform the annotation work to build the data set.

Can I monitor the labellers' work?

As a manager, you can monitor your labellers' progress, spot-check their work and compare their annotations with those of other labellers.

How do I manage my team?

The main tasks in managing your team are creating team members with the correct roles and assigning them work via queues. Under your account icon, you will find a link to "Manage Team".

[Screenshot: Manage Team page]

From this page you can add new team members with the desired roles, send invitations and delete team members. Note that if you are setting up an annotation project for the first time, you can add your team members through the new annotation project wizard.

What is an Annotation?

Annotations are labels added to your images that attach information to particular image regions. This information most commonly takes the form of object classes and metadata.

[Screenshot: object classes]

What is an Object Class?

An object class is a label that you can use to tag a portion of an image. The object classes you set up and assign to projects appear in the annotation user interface, where labellers use them to tag images.


What is a Data Source?

A data source is the location of a collection of images, such as an AWS S3 bucket. You can use a data source to feed work to teams through image queues. A queue is automatically created for you when you set up a data source with the wizard. Data sources can be viewed and synced after creation.

[Screenshot: data source view]

Data sources can also be viewed independently of projects in order to review the images they reference.
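
For illustration, assuming the images for a data source live in an AWS S3 bucket, the sketch below lists the image keys such a source would reference. It uses boto3 directly (with placeholder bucket and prefix names) and is not a Highlighter API call.

```python
# A minimal sketch, assuming boto3 and AWS credentials are configured;
# the bucket name and prefix in the example call are placeholders.
import boto3

def list_source_images(bucket: str, prefix: str = "") -> list:
    """List JPEG/PNG object keys under `prefix` in `bucket`."""
    s3 = boto3.client("s3")
    keys = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["Key"].lower().endswith((".jpg", ".jpeg", ".png")):
                keys.append(obj["Key"])
    return keys

# keys = list_source_images("my-imagery-bucket", prefix="orchard/2023/")
```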

What is a Queue?

A queue is the mechanism through which work is assigned to team members and machines. Queues let you create streams of images that can be fanned out to team members for annotation or quality assurance tasks, or to machine learning workers for training or inference.

[Screenshot: example queue]

How do you ensure quality control?

To add a quality control process to your project, you simply create and assign an additional queue. This queue takes as input the upstream data to be quality checked and is assigned to the team members who will perform quality assurance.

These team members can then reject, flag or repair existing annotations before submitting them downstream for further quality control, or for use in training or reporting.