Information technology -- Coded representation of immersive media

This document specifies immersive media metrics and the measurement framework. The immersive media metrics can be collected by service providers and used to enhance the immersive media quality and experiences. This document also includes a client reference model with observation and measurement points for collection of the metrics.

Technologies de l'information -- Représentation codée de média immersifs

General Information

Status: Published
Publication Date: 14-Jun-2021
Current Stage: 5060 - Close of voting, Proof returned by Secretariat
Start Date: 13-May-2021
Completion Date: 13-May-2021

Standards Content (Sample)

INTERNATIONAL STANDARD ISO/IEC 23090-6
First edition, 2021-06

Information technology — Coded representation of immersive media —
Part 6: Immersive media metrics

Reference number: ISO/IEC 23090-6:2021(E)
© ISO/IEC 2021


COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2021
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting
on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address
below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland

Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Abbreviated terms
5 Arithmetic operators and mathematical functions
6 Immersive media metrics client reference model
6.1 Overview
6.2 Definition of observation points
6.2.1 General
6.2.2 Observation point 1
6.2.3 Observation point 2
6.2.4 Observation point 3
6.2.5 Observation point 4
6.2.6 Observation point 5
7 Metrics
7.1 General
7.2 Rendered FOV set metric
7.3 Display information set metric
7.4 Rendered viewports metric
7.5 Comparable quality viewport switching latency metric
8 Metric measurement process
8.1 General
8.2 Rendered viewport measurement
8.3 Comparable quality viewport switching latency measurement
Annex A (informative) Illustration of implementation

Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC
list of patent declarations received (see https://patents.iec.ch).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to the
World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.

Introduction
The immersive media metrics and measurement framework provide interoperability for consistent
logging and monitoring of immersive media quality and experiences.
Annex A provides an illustration of immersive media metrics measurement.

Information technology — Coded representation of immersive media —
Part 6: Immersive media metrics
1 Scope
This document specifies immersive media metrics and the measurement framework. The immersive
media metrics can be collected by service providers and used to enhance the immersive media
quality and experiences. This document also includes a client reference model with observation and
measurement points for collection of the metrics.
2 Normative references
The following documents are referred to in the text in such a way that some or all of their content
constitutes requirements of this document. For dated references, only the edition cited applies. For
undated references, the latest edition of the referenced document (including any amendments) applies.
ISO/IEC 23009-1:2019, Information technology — Dynamic adaptive streaming over HTTP (DASH) — Part 1: Media presentation description and segment formats
ISO/IEC 23090-2, Information technology — Coded representation of immersive media — Part 2: Omnidirectional media format
3 Terms and definitions
No terms and definitions are listed in this document.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/
4 Abbreviated terms
2D two-dimensional
2DQR two-dimensional quality ranking
DASH dynamic adaptive streaming over HTTP
ER effective resolution
ERT effective resolution threshold
FOV field of view
OMAF omnidirectional media format
MCR metrics computing and reporting

MPD media presentation description
OP observation point
PPI pixels per inch
QR quality ranking
QRT quality ranking threshold
SRQR spherical-region quality ranking
VR virtual reality
5 Arithmetic operators and mathematical functions
+ addition
− subtraction (as a two-argument operator) or negation (as a unary prefix operator)
* multiplication, including matrix multiplication
Σ_{i=x}^{y} f(i)   summation of f(i) with i taking all integer values from x up to and including y
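As an illustrative (non-normative) expansion of the summation operator, written in LaTeX notation with example values chosen only for this illustration:

\sum_{i=x}^{y} f(i) = f(x) + f(x+1) + \dots + f(y), \qquad \text{for example } \sum_{i=1}^{3} i^{2} = 1 + 4 + 9 = 14.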
6 Immersive media metrics client reference model
6.1 Overview
A generic immersive media client reference model is shown in Figure 1 with observation points for
metrics measurement. The model consists of key functional modules including network access, media
processing, sensor, media renderer, and immersive media application. A VR client may be an OMAF player that performs file/segment reception or file access, file/segment decapsulation, decoding of audio, video, or image bitstreams, audio and image rendering, and viewport selection. The metrics computing and
reporting (MCR) module queries the measurable data from various functional modules and calculates
the specified metrics. The MCR module may reside inside or outside of the VR client. The specified
metrics may then be reported to an analytics server or to other entities that are interested in and authorized to access
such metrics. The analytics server or other entities may use the metrics data to analyse the end user
experience, assess client device capabilities, and evaluate the immersive system performance in order
to enhance the overall immersive service experience across network, platform, device, applications and
services.

Figure 1 — Immersive media metrics client reference model
[Figure graphic not reproduced in this extract; key to the numbered elements follows.]
Key
1 network access                 14 audio rendering
2 media processing               15 OP1
3 sensor                         16 OP2
4 media renderer                 17 OP3
5 immersive media application    18 OP4
6 MCR                            19 OP5
7 immersive media metrics        20 media segment
8 immersive presentation         21 metadata
9 file/segment decapsulation     22 media track
10 video decoding                23 media data
11 image decoding                24 control/config
12 audio decoding                25 sensor data
13 image rendering
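To make the data flow of Figure 1 concrete, the following non-normative Python sketch shows one way an MCR module might poll observation-point interfaces and forward a computed metric to an analytics server. The class name, field names and polling interface are assumptions made for this illustration; only the metric key names and the 2^−16-degree coding come from this document (see Table 2).

# Non-normative sketch: observation-point query functions and the reporting
# callback are hypothetical; this document defines metrics, not a software API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

UNIT = 1 << 16  # angles in immersive media metrics are coded in 2^-16 degrees


@dataclass
class MetricsComputingAndReporting:
    """Queries measurable data from the functional modules (OP1..OP5)."""
    observation_points: Dict[str, Callable[[], dict]]  # e.g. {"OP5": poll_app}
    reports: List[dict] = field(default_factory=list)

    def collect(self) -> Dict[str, dict]:
        # Pull the latest measurable data exposed at each observation point.
        return {name: query() for name, query in self.observation_points.items()}

    def compute_and_report(self, send: Callable[[dict], None]) -> None:
        samples = self.collect()
        # Example derived metric: the FOV rendered by the client (Table 2).
        fov = samples["OP5"]
        metric = {"RenderedFovSet": [{
            "renderedFovH": int(fov["fov_h_deg"] * UNIT),
            "renderedFovV": int(fov["fov_v_deg"] * UNIT),
        }]}
        self.reports.append(metric)
        send(metric)  # e.g. POST to an authorized analytics server


# Example usage with a stubbed OP5 interface:
mcr = MetricsComputingAndReporting(
    observation_points={"OP5": lambda: {"fov_h_deg": 90.0, "fov_v_deg": 90.0}})
mcr.compute_and_report(send=print)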
6.2 Definition of observation points
6.2.1 General
This clause defines the observation points as depicted in Figure 1.
6.2.2 Observation point 1
The network access module issues media file/segment requests and receives media files or segment
streams from the network. The interface from the network access element towards MCR is referred to
as observation point 1 (OP1). This observation point is equivalent to ISO/IEC 23009-1 observation point
1 as defined in ISO/IEC 23009-1:2019, D.3.2.
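As a non-normative illustration, the kind of per-request data observable at OP1 could be represented as below; the field names are assumptions for this sketch, and the normative collectable data for this point is that of ISO/IEC 23009-1 observation point 1.

# Hypothetical OP1 record for one media file/segment request; field names
# are illustrative, not defined by this document or ISO/IEC 23009-1.
from dataclasses import dataclass


@dataclass
class Op1SegmentRequest:
    request_url: str        # media file or segment requested by network access
    request_time_ms: int    # wall-clock time the request was issued
    response_time_ms: int   # wall-clock time the file/segment was received
    bytes_received: int     # size of the received media file or segment stream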

6.2.3 Observation point 2
The media processing module processes the file or the received media track, extracts the coded
bitstreams, parses the media and metadata, and decodes the media. The interface from the media
processing module towards MCR is referred to as observation point 2 (OP2).
The collectable data of OP2 includes parameters such as:
— MPD information, for example:
— media type;
— media codec;
— adaptation set, representation, and preselection IDs;
— OMAF metadata, for example:
— omnidirectional video projection;
— omnidirectional video region-wise packing;
— omnidirectional viewport;
— Other media metadata, for example:
— frame packing;
— colour space;
— dynamic range.
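A non-normative Python sketch grouping the OP2 parameters listed above into one record; the field names and types are assumptions chosen for the example.

# Hypothetical container for data collectable at OP2 (see list above).
from dataclasses import dataclass
from typing import Optional


@dataclass
class Op2MediaInfo:
    # MPD information
    media_type: str                             # e.g. "video"
    media_codec: str                            # codec identifier from the MPD
    adaptation_set_id: Optional[int] = None
    representation_id: Optional[str] = None
    preselection_id: Optional[str] = None
    # OMAF metadata
    projection: Optional[str] = None            # omnidirectional video projection
    region_wise_packing: Optional[dict] = None  # omnidirectional video region-wise packing
    viewport: Optional[dict] = None             # omnidirectional viewport
    # Other media metadata
    frame_packing: Optional[str] = None
    colour_space: Optional[str] = None
    dynamic_range: Optional[str] = None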
6.2.4 Observation point 3
The sensor module acquires the user’s viewing orientation, position and interaction. The interface from
the sensor towards MCR is referred to as observation point 3 (OP3). The sensor data may be used by
the network access, media processing and media renderer modules to retrieve, process and render VR media
elements. For example, the current viewing orientation may be determined by the head tracking and
possibly also eye tracking functionality. Besides being used by the renderer to render the appropriate
part of decoded video and audio signals, the current viewing orientation may also be used by the
network access for viewport dependent streaming and by the video and audio decoders for decoding
optimization.
OP3 provides, for example, collectable sensor data for:
— the centre point of the current viewport;
— head motion tracking;
— eye tracking.
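For illustration only, a single OP3 sensor sample could be modelled as follows; the field names and the chosen representations (Euler angles for the viewport centre, a quaternion for head orientation) are assumptions, not requirements of this document.

# Hypothetical OP3 sensor sample; representations are illustrative.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Op3SensorSample:
    timestamp_ms: int
    # Centre point of the current viewport: (azimuth, elevation, tilt) in degrees.
    viewport_centre: Tuple[float, float, float]
    # Head motion tracking: orientation quaternion (w, x, y, z) and position (x, y, z).
    head_orientation: Optional[Tuple[float, float, float, float]] = None
    head_position: Optional[Tuple[float, float, float]] = None
    # Eye tracking: normalized gaze direction, if supported by the device.
    gaze_direction: Optional[Tuple[float, float, float]] = None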
6.2.5 Observation point 4
The media renderer module synchronizes and plays back the different VR media components to provide
a fully immersive VR experience to the user. The decoded pictures are projected onto the screen of a
head-mounted display or any other display device based on the current viewing orientation or viewport
based on the metadata that includes information on region-wise packing, frame packing, projection,
and sphere rotation as defined in ISO/IEC 23090-2. Likewise, decoded audio is rendered (e.g. through
headphones) according to the current viewing orientation. The media renderer module may support
colour conversion, projection, and media composition for each VR media component. The interface from
the media renderer towards MCR is referred to as observation point 4 (OP4).

This observation point is equivalent to ISO/IEC 23009-1 observation point 3 as defined in
ISO/IEC 23009-1:2019, D.3.4.
The collectable data from OP4 may, for example, include:
— the media type;
— the media sample presentation timestamp;
— wall clock time;
— actual rendered viewport;
— actual media sample rendering time;
— actual rendering frame rate.
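The items above can be pictured as one render-event record per media sample; the following sketch is non-normative and its field names are assumptions made for the example.

# Hypothetical OP4 record for one rendered media sample.
from dataclasses import dataclass


@dataclass
class Op4RenderEvent:
    media_type: str                 # e.g. "video" or "audio"
    presentation_timestamp_ms: int  # media sample presentation timestamp
    wall_clock_time_ms: int         # wall clock time at which rendering occurred
    rendered_viewport: dict         # actual rendered viewport (see ViewportDataType, Table 1)
    rendering_time_ms: int          # actual media sample rendering time
    rendering_frame_rate: float     # actual rendering frame rate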
6.2.6 Observation point 5
The immersive media application manages the application configurations such as display resolution,
frame rate, field of view (FOV), lens separation distance, etc. The interface from the immersive media
application towards MCR is referred to as observation point 5 (OP5).
OP5 consists of client capability and configuration parameters, and the collectable data from OP5
includes, for example:
— display resolution;
— display density, in units of pixels per inch (PPI);
— horizontal and vertical FOV, in units of degrees;
— media format and codec support;
— OS support.
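A minimal, non-normative sketch of an OP5 capability and configuration record covering the items above; the field names are assumptions for the example.

# Hypothetical OP5 client capability and configuration record.
from dataclasses import dataclass
from typing import List


@dataclass
class Op5ClientConfig:
    display_resolution: str      # e.g. "2560x1440"
    display_ppi: int             # display density in pixels per inch (PPI)
    fov_horizontal_deg: float    # horizontal FOV in degrees
    fov_vertical_deg: float      # vertical FOV in degrees
    supported_codecs: List[str]  # media format and codec support
    os_version: str              # OS support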
7 Metrics
7.1 General
This clause specifies the immersive media metrics. The syntax for the DASH metrics as specified in
ISO/IEC 23009-1:2019, D.4.1 is used for immersive media metrics with the following addition:
A new data type, ViewportDataType, is defined as shown in Table 1. ViewportDataType is an object with
six integer keys that identify a viewport. The six keys are: viewpoint_id, centre_azimuth, centre_
elevation, centre_tilt, azimuth_range and elevation_range.

Table 1 — ViewportDataType

ViewportDataType (Object)
viewpoint_id (Integer): Specifies the identifier of the viewpoint to which the viewport belongs.
centre_azimuth (Integer): Specifies the azimuth of the centre of the viewport in units of 2^−16 degrees. The value shall be in the range of −180 * 2^16 to 180 * 2^16 − 1, inclusive.
centre_elevation (Integer): Specifies the elevation of the centre of the viewport in units of 2^−16 degrees. The value shall be in the range of −90 * 2^16 to 90 * 2^16, inclusive.
centre_tilt (Integer): Specifies the tilt angle of the viewport in units of 2^−16 degrees. The value shall be in the range of −180 * 2^16 to 180 * 2^16 − 1, inclusive.
azimuth_range (Integer): Specifies the azimuth range of the viewport through the centre point of the viewport, in units of 2^−16 degrees.
elevation_range (Integer): Specifies the elevation range of the viewport through the centre point of the viewport, in units of 2^−16 degrees.
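A non-normative Python rendering of ViewportDataType follows; the keys and value ranges come from Table 1, while the degree-conversion helper is an assumption added only for illustration.

# ViewportDataType as defined in Table 1; angles are integers in 2^-16 degrees.
from dataclasses import dataclass

UNIT = 1 << 16  # 1 degree == 2^16 coded units


@dataclass
class ViewportDataType:
    viewpoint_id: int
    centre_azimuth: int    # -180 * 2^16 .. 180 * 2^16 - 1
    centre_elevation: int  #  -90 * 2^16 ..  90 * 2^16
    centre_tilt: int       # -180 * 2^16 .. 180 * 2^16 - 1
    azimuth_range: int     # azimuth range through the viewport centre
    elevation_range: int   # elevation range through the viewport centre

    @classmethod
    def from_degrees(cls, viewpoint_id: int, azimuth: float, elevation: float,
                     tilt: float, azimuth_range: float, elevation_range: float):
        # Illustrative helper: convert degrees to the coded integer units.
        u = lambda deg: int(round(deg * UNIT))
        return cls(viewpoint_id, u(azimuth), u(elevation), u(tilt),
                   u(azimuth_range), u(elevation_range))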
7.2 Rendered FOV set metric
The RenderedFOVSet metric reports a set of FOVs rendered by VR client devices, as specified in Table 2.
Table 2 — RenderedFOVSet

RenderedFovSet (Set): set of rendered FOVs
Entry (Object)
renderedFovH (Integer): The horizontal element of the rendered FOV, in units of 2^−16 degrees. The value shall be in the range of 0 to 360 * 2^16, inclusive.
renderedFovV (Integer): The vertical element of the rendered FOV, in units of 2^−16 degrees. The value shall be in the range of 0 to 360 * 2^16, inclusive.
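As a non-normative worked example, a client rendering a 90-degree by 90-degree FOV would encode one RenderedFovSet entry as below; the dictionary representation is an assumption, since the report serialization is outside the scope of this clause.

# RenderedFovSet entry for a 90-degree by 90-degree rendered FOV (Table 2).
UNIT = 1 << 16  # rendered FOV values are coded in units of 2^-16 degrees

rendered_fov_set = [{
    "renderedFovH": 90 * UNIT,  # 90 degrees -> 5898240 coded units
    "renderedFovV": 90 * UNIT,  # 90 degrees -> 5898240 coded units
}]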
7.3 Display information set metric
The DisplayInfoSet metric reports a set of display resolution, pixel density and refresh rate values
used by VR clients for rendering the VR video, as specified in Table 3.
Table 3 — DisplayInfoSet

DisplayInfoSet (Set): set of display information
Entry (Object)
displayResolution (String): display
...
