Calculate mean Intersection-Over-Union (mIOU) metric

A ready-to-use script to compute the mean Intersection-Over-Union (mIOU) metric for pairs of classes (the metric definition is sketched below)

Input:

  • Existing Project (e.g. "london_roads")
  • At least one pair of classes (e.g. ("cargt", "carlb"))

Output:

  • Intersection, union, and IoU for each class pair
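
For reference, IoU is computed for each configured class pair and then averaged over all pairs; a sketch of the standard definitions (notation is ours, not quoted from the library docs):

$$\mathrm{IoU}(A, B) = \frac{|A \cap B|}{|A \cup B|}, \qquad \mathrm{mIoU} = \frac{1}{N}\sum_{i=1}^{N}\mathrm{IoU}(A_i, B_i)$$

where $A_i$ and $B_i$ are the pixel sets of the $i$-th class pair and $N$ is the number of pairs in classes_mapping.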

Imports

In [1]:
import supervisely_lib as sly
import os
import collections
from prettytable import PrettyTable
from tqdm import tqdm

Configuration

Edit the following settings for your own case

In [2]:
team_name = "jupyter_tutorials"
workspace_name = "metrics_tutorials"
project_name = "tutorial_metric_iou_project"

classes_mapping = {
    "dog": "annotator_dog",
    "person": "annotator_person",    
}

# Obtain server address and your api_token from environment variables
# Edit those values if you run this notebook on your own PC
address = os.environ['SERVER_ADDRESS']
token = os.environ['API_TOKEN']

Script setup

Initialize the Supervisely API to remotely manage your projects (the required packages were already imported above)

In [3]:
# Initialize API object
api = sly.Api(address, token)

Verify input values

Test that context (team / workspace / project) exists

In [4]:
team = api.team.get_info_by_name(team_name)
if team is None:
    raise RuntimeError("Team {!r} not found".format(team_name))

workspace = api.workspace.get_info_by_name(team.id, workspace_name)
if workspace is None:
    raise RuntimeError("Workspace {!r} not found".format(workspace_name))
    
project = api.project.get_info_by_name(workspace.id, project_name)
if project is None:
    raise RuntimeError("Project {!r} not found".format(project_name))
    
print("Team: id={}, name={}".format(team.id, team.name))
print("Workspace: id={}, name={}".format(workspace.id, workspace.name))
print("Project: id={}, name={}".format(project.id, project.name))
Out [4]:
Team: id=30, name=jupyter_tutorials
Workspace: id=78, name=metrics_tutorials
Project: id=930, name=tutorial_metric_iou_project

Get Project Meta of Source Project

Project Meta contains information about the classes and tags defined in the project.

In [5]:
meta_json = api.project.get_meta(project.id)
meta = sly.ProjectMeta.from_json(meta_json)

# check if all classes exist
project_classes_names = list(classes_mapping.keys()) + list(classes_mapping.values())

for class_name in project_classes_names:
    if class_name not in meta.obj_classes.keys():
        raise RuntimeError("Class {!r} not found in source project {!r}".format(class_name, project.name))

Create metric evaluator

In [6]:
metric_iou = sly.IoUMetric(classes_mapping)

Iterate over all images and calculate the metric from annotation pairs

In [7]:
for dataset in api.dataset.get_list(project.id):

    print("Processing: project = {!r}, dataset = {!r} \n".format(project.name, dataset.name))

    for image in tqdm(api.image.get_list(dataset.id)):

        # download and deserialize the image annotation
        ann_info = api.annotation.download(image.id)
        ann_json = ann_info.annotation
        ann = sly.Annotation.from_json(ann_json, meta)

        # both classes of each configured pair live in the same annotation,
        # so the annotation is compared with itself
        metric_iou.add_pair(ann, ann)
Out [7]:
 33%|███▎      | 1/3 [00:00<00:00,  9.04it/s]
Processing: project = 'tutorial_metric_iou_project', dataset = 'dataset_01' 

100%|██████████| 3/3 [00:00<00:00,  9.37it/s]
100%|██████████| 2/2 [00:00<00:00, 34.64it/s]
Processing: project = 'tutorial_metric_iou_project', dataset = 'dataset_02' 

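In this tutorial project both classes of every pair (e.g. "dog" and "annotator_dog") are stored in the same annotation, which is why each annotation is compared with itself and the resulting IoU equals 1.0. To compare two separate projects (for example, ground truth vs. annotator output), the same loop can be driven by matching datasets and images by name. The sketch below is only an illustration: the second project name is a placeholder, and it assumes both projects share dataset and image names.

# Hypothetical second project holding the annotations to compare against
pred_project = api.project.get_info_by_name(workspace.id, "tutorial_metric_iou_project_pred")
pred_meta = sly.ProjectMeta.from_json(api.project.get_meta(pred_project.id))

pred_datasets = {ds.name: ds for ds in api.dataset.get_list(pred_project.id)}

for gt_dataset in api.dataset.get_list(project.id):
    pred_dataset = pred_datasets[gt_dataset.name]
    pred_images = {img.name: img for img in api.image.get_list(pred_dataset.id)}

    for gt_image in tqdm(api.image.get_list(gt_dataset.id)):
        pred_image = pred_images[gt_image.name]

        # download both annotations and feed the pair to the metric evaluator
        gt_ann = sly.Annotation.from_json(api.annotation.download(gt_image.id).annotation, meta)
        pred_ann = sly.Annotation.from_json(api.annotation.download(pred_image.id).annotation, pred_meta)

        metric_iou.add_pair(gt_ann, pred_ann)
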
Print results using the default logger

In [8]:
metric_iou.log_total_metrics()
Out [8]:
{"message": "**************** Result IoU metric values ****************", "timestamp": "2019-04-03T07:26:31.142Z", "level": "info"}
{"message": "1. Classes dog <-> annotator_dog:   IoU = 1.000000,  mean intersection = 10211.800000, mean union = 10211.800000", "timestamp": "2019-04-03T07:26:31.147Z", "level": "info"}
{"message": "2. Classes person <-> annotator_person:   IoU = 1.000000,  mean intersection = 40750.000000, mean union = 40750.000000", "timestamp": "2019-04-03T07:26:31.152Z", "level": "info"}
{"message": "Total:   IoU = 1.000000,  mean intersection = 254809.000000, mean union = 254809.000000", "timestamp": "2019-04-03T07:26:31.155Z", "level": "info"}

Print results manually

In [14]:
results = metric_iou.get_metrics()
total_results = metric_iou.get_total_metrics()

table = PrettyTable(["classes pair", "metrics values"])

def build_values_text(values):
    values_text = ""
    for metrics_name, value in values.items():
        values_text += "{}: {}\n".format(metrics_name, value)
    return values_text
    
for first_pair_class, values in results.items():
    pair_text = "{} <-> {}".format(first_pair_class, classes_mapping[first_pair_class])
    table.add_row([pair_text, build_values_text(values)])

table.add_row(["TOTAL", build_values_text(total_results)])
print(table.get_string())
Out [14]:
+-----------------------------+-----------------------+
|         classes pair        |     metrics values    |
+-----------------------------+-----------------------+
|    dog <-> annotator_dog    | intersection: 10211.8 |
|                             |     union: 10211.8    |
|                             |        iou: 1.0       |
|                             |                       |
| person <-> annotator_person | intersection: 40750.0 |
|                             |     union: 40750.0    |
|                             |        iou: 1.0       |
|                             |                       |
|            TOTAL            |  intersection: 254809 |
|                             |     union: 254809     |
|                             |        iou: 1.0       |
|                             |                       |
+-----------------------------+-----------------------+
