Internet Engineering Task Force (IETF)                        M. Bagnulo
Request for Comments: 8911                                          UC3M
Category: Standards Track                                      B. Claise
ISSN: 2070-1721                                                   Huawei
                                                              P. Eardley
                                                                      BT
                                                               A. Morton
                                                               AT&T Labs
                                                               A. Akhter
                                                              Consultant
                                                           November 2021

Registry for Performance Metrics

Abstract

This document defines the format for the IANA Registry of Performance Metrics. This document also gives a set of guidelines for Registered Performance Metric requesters and reviewers.

Status of This Memo

This is an Internet Standards Track document. This document is a product of the Internet Engineering Task Force (IETF). It represents the consensus of the IETF community. It has received public review and has been approved for publication by the Internet Engineering Steering Group (IESG). Further information on Internet Standards is available in Section 2 of RFC 7841. Information about the current status of this document, any errata, and how to provide feedback on it may be obtained at https://www.rfc-editor.org/info/rfc8911.

Copyright Notice

Copyright (c) 2021 IETF Trust and the persons identified as the document authors. All rights reserved. This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.

Table of Contents

1. Introduction
2. Terminology
3. Scope
4. Motivations for the Performance Metrics Registry
   4.1. Interoperability
   4.2. Single Point of Reference for Performance Metrics
   4.3. Side Benefits
5. Criteria for Performance Metrics Registration
6.
Performance Metrics Registry: Prior Attempt
   6.1. Why This Attempt Should Succeed
7. Definition of the Performance Metrics Registry
   7.1. Summary Category
        7.1.1. Identifier
        7.1.2. Name
        7.1.3. URI
        7.1.4. Description
        7.1.5. Reference
        7.1.6. Change Controller
        7.1.7. Version (of Registry Format)
   7.2. Metric Definition Category
        7.2.1. Reference Definition
        7.2.2. Fixed Parameters
   7.3. Method of Measurement Category
        7.3.1. Reference Method
        7.3.2. Packet Stream Generation
        7.3.3. Traffic Filter
        7.3.4. Sampling Distribution
        7.3.5. Runtime Parameters
        7.3.6. Role
   7.4. Output Category
        7.4.1. Type
        7.4.2. Reference Definition
        7.4.3. Metric Units
        7.4.4. Calibration
   7.5. Administrative Information
        7.5.1. Status
        7.5.2. Requester
        7.5.3. Revision
        7.5.4. Revision Date
   7.6. Comments and Remarks
8. Processes for Managing the Performance Metrics Registry Group
   8.1. Adding New Performance Metrics to the Performance Metrics Registry
   8.2. Backward-Compatible Revision of Registered Performance Metrics
   8.3. Non-Backward-Compatible Deprecation of Registered Performance Metrics
   8.4. Obsolete Registry Entries
   8.5. Registry Format Version and Future Changes/Extensions
9. Security Considerations
10. IANA Considerations
   10.1. Registry Group
   10.2. Performance Metrics Name Elements
   10.3. New Performance Metrics Registry
11. Blank Registry Template
   11.1. Summary
        11.1.1. ID (Identifier)
        11.1.2. Name
        11.1.3. URI
        11.1.4. Description
        11.1.5. Reference
        11.1.6. Change Controller
        11.1.7. Version (of Registry Format)
   11.2. Metric Definition
        11.2.1. Reference Definition
        11.2.2. Fixed Parameters
   11.3. Method of Measurement
        11.3.1. Reference Method
        11.3.2. Packet Stream Generation
        11.3.3. Traffic Filtering (Observation) Details
        11.3.4. Sampling Distribution
        11.3.5. Runtime Parameters and Data Format
        11.3.6. Roles
   11.4. Output
        11.4.1. Type
        11.4.2. Reference Definition
        11.4.3. Metric Units
        11.4.4. Calibration
   11.5. Administrative Items
        11.5.1. Status
        11.5.2. Requester
        11.5.3. Revision
        11.5.4.
Revision Date 11.6. Comments and Remarks 12. References 12.1. Normative References 12.2. Informative References Acknowledgments Authors' Addresses 1. Introduction The IETF specifies and uses Performance Metrics of protocols and applications transported over its protocols. Performance Metrics are an important part of network operations using IETF protocols, and [RFC6390] specifies guidelines for their development. The definition and use of Performance Metrics in the IETF have been fostered in various working groups (WGs). Most notably: * The "IP Performance Metrics" (IPPM) WG is the WG primarily focusing on Performance Metrics definition at the IETF. * The "Benchmarking Methodology" WG (BMWG) defines many Performance Metrics for use in laboratory benchmarking of internetworking technologies. * The "Metric Blocks for use with RTCP's Extended Report Framework" (XRBLOCK) WG (concluded) specified many Performance Metrics related to "RTP Control Protocol Extended Reports (RTCP XR)" [RFC3611], which establishes a framework to allow new information to be conveyed in RTCP, supplementing the original report blocks defined in "RTP: A Transport Protocol for Real-Time Applications" [RFC3550]. * The "IP Flow Information eXport" (IPFIX) WG (concluded) specified an Internet Assigned Numbers Authority (IANA) process for new Information Elements. Some Information Elements related to Performance Metrics are proposed on a regular basis. * The "Performance Metrics for Other Layers" (PMOL) WG (concluded) defined some Performance Metrics related to Session Initiation Protocol (SIP) voice quality [RFC6035]. It is expected that more Performance Metrics will be defined in the future -- not only IP-based metrics but also metrics that are protocol specific and application specific. 
Despite the importance of Performance Metrics, there are two related problems for the industry: * First, ensuring that when one party requests that another party measure (or report or in some way act on) a particular Performance Metric, both parties have exactly the same understanding of what Performance Metric is being referred to. * Second, discovering which Performance Metrics have been specified, to avoid developing a new Performance Metric that is very similar but not quite interoperable. These problems can be addressed by creating a Registry for Performance Metrics with the Internet Assigned Numbers Authority (IANA). As such, this document defines the new IANA Registry for Performance Metrics. Per this document, IANA has created and now maintains the Performance Metrics Registry, according to the maintenance procedures and the format defined in the sections below. The resulting Performance Metrics Registry is for use by the IETF and others. Although the Registry formatting specifications herein are primarily for Registry creation by IANA, any other organization that wishes to create a Performance Metrics Registry may use the same formatting specifications for their purposes. The authors make no guarantee of the Registry format's applicability to any possible set of Performance Metrics envisaged by other organizations, but we encourage others to apply it. In the remainder of this document, unless we explicitly say otherwise, we will refer to the IANA-maintained Performance Metrics Registry as simply the Performance Metrics Registry. 2.
Terminology The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here. Performance Metric: A quantitative measure of performance, targeted to an IETF-specified protocol or targeted to an application transported over an IETF-specified protocol. Examples of Performance Metrics are the FTP response time for a complete file download, the DNS Response time to resolve the IP address(es), a database logging time, etc. This definition is consistent with the definition of a metric in [RFC2330] and broader than the definition of a Performance Metric in [RFC6390]. Registered Performance Metric: A Performance Metric expressed as an entry in the Performance Metrics Registry, administered by IANA. Such a Performance Metric has met all of the Registry review criteria defined in this document in order to be included in the Registry. Performance Metrics Registry: The IANA Registry containing Registered Performance Metrics. Proprietary Registry: A set of metrics that are registered in a proprietary Registry, as opposed to the Performance Metrics Registry. Performance Metrics Experts: A group of designated experts [RFC8126] selected by the IESG to validate the Performance Metrics before updating the Performance Metrics Registry. The Performance Metrics Experts work closely with IANA. Parameter: An input factor defined as a variable in the definition of a Performance Metric. A Parameter is a numerical or other specified factor forming one of a set that defines a metric or sets the conditions of its operation. All Parameters must be known in order to make a measurement using a metric and interpret the results. There are two types of Parameters: Fixed and Runtime.
For the Fixed Parameters, the value of the variable is specified in the Performance Metrics Registry Entry, and different Fixed Parameter values result in different Registered Performance Metrics. For the Runtime Parameters, the value of the variable is defined when the Metric Measurement Method is executed, and a given Registered Performance Metric supports multiple values for the Parameter. Although Runtime Parameters do not change the fundamental nature of the Performance Metric's definition, some have substantial influence on the network property being assessed and interpretation of the results.

|  Note: Consider the case of packet loss in the following two
|  Active Measurement Method cases. The first case is packet loss
|  as background loss where the Runtime Parameter set includes a
|  very sparse Poisson stream and only characterizes the times
|  when packets were lost. Actual user streams likely see much
|  higher loss at these times, due to tail drop or radio errors.
|  The second case is packet loss ratio as the complementary
|  probability of delivery ratio where the Runtime Parameter set
|  includes a very dense, bursty stream, and characterizes the
|  loss experienced by a stream that approximates a user stream.
|  These are both "Loss metrics", but the difference in
|  interpretation of the results is highly dependent on the
|  Runtime Parameters (at least), to the extreme where we are
|  actually using loss ratio to infer its complementary
|  probability: delivery ratio.

Active Measurement Methods: Methods of Measurement conducted on traffic that serves only the purpose of measurement and is generated for that reason alone, and whose traffic characteristics are known a priori. The complete definition of Active Methods is specified in Section 3.4 of [RFC7799]. Examples of Active Measurement Methods are the Measurement Methods for the one-way delay metric defined in [RFC7679] and the round-trip delay metric defined in [RFC2681].
Passive Measurement Methods: Methods of Measurement conducted on network traffic, generated by either (1) the end users or (2) network elements that would exist regardless of whether the measurement was being conducted or not. The complete definition of Passive Methods is specified in Section 3.6 of [RFC7799]. One characteristic of Passive Measurement Methods is that sensitive information may be observed and, as a consequence, stored in the measurement system. Hybrid Measurement Methods: Methods of Measurement that use a combination of Active Methods and Passive Methods, to assess Active Metrics, Passive Metrics, or new metrics derived from the a priori knowledge and observations of the stream of interest. The complete definition of Hybrid Methods is specified in Section 3.8 of [RFC7799]. 3. Scope This document is intended for two different audiences: 1. For those preparing a candidate Performance Metric, it provides criteria that the proposal SHOULD meet (see Section 5). It also provides instructions for writing the text for each column of the candidate Performance Metric and the supporting documentation required for the new Performance Metrics Registry Entry (up to and including the publication of one or more immutable documents such as an RFC) (see Section 7). 2. For the appointed Performance Metrics Experts and for IANA personnel administering the new IANA Performance Metrics Registry, it defines a set of acceptance criteria against which a candidate Registered Performance Metric should be evaluated, and requirements for the composition of a candidate Performance Metric Registry Entry.
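For readers who find a concrete illustration helpful, the reviewer's obligation to check a candidate Registry Entry can be sketched in code. This is a non-normative sketch: the field names, sample values, and URI below are invented for illustration only; the authoritative columns and their population rules are those defined in Section 7.

```python
# Non-normative sketch: verify that a candidate Registry Entry supplies
# values for the columns that must never be "N/A" (see Section 7).
# All field names and the sample entry are illustrative, not normative.

MANDATORY_COLUMNS = (
    "Identifier", "Name", "URI", "Status",
    "Requester", "Revision", "Revision Date", "Description",
)

def review_candidate(entry):
    """Return a list of problems found in a candidate entry (empty if none)."""
    problems = []
    for column in MANDATORY_COLUMNS:
        value = entry.get(column)
        if value in (None, "", "N/A"):
            problems.append("column %r is missing or N/A" % column)
    return problems

# A hypothetical candidate entry; Name and URI are placeholders.
candidate = {
    "Identifier": 1,
    "Name": "RTDelay_Active_IP-UDP-Periodic_RFCXXXXsecY_Seconds_Mean",
    "URI": "https://example.org/metrics/1",
    "Status": "Current",
    "Requester": "Example requester",
    "Revision": "1.0",
    "Revision Date": "2021-11-01",
    "Description": "Mean round-trip delay (illustrative description).",
}
assert review_candidate(candidate) == []
assert review_candidate({"Identifier": 2, "Status": "N/A"}) != []
```

The sketch only captures the mechanical part of review; the substantive acceptance criteria of Section 5 still require human expert judgment.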
Other organizations that standardize performance metrics are encouraged to use the process defined in this memo to propose a candidate Registered Performance Metric. In addition, this document may be useful for other organizations who are defining a Performance Metrics Registry of their own and may reuse the features of the Performance Metrics Registry defined in this document. This Performance Metrics Registry is applicable to Performance Metrics derived from Active Measurement, Passive Measurement, and any other form of Performance Metric. This Registry is designed to encompass Performance Metrics developed throughout the IETF and especially for the technologies specified in the following working groups: IPPM, XRBLOCK, IPFIX, and BMWG. This document analyzes a prior attempt to set up a Performance Metrics Registry and the reasons why this design was inadequate [RFC6248]. Finally, this document gives a set of guidelines for requesters and Expert Reviewers of candidate Registered Performance Metrics. [RFC8912] populates the new Registry with the initial set of entries. 4. Motivations for the Performance Metrics Registry In this section, we detail several motivations for the Performance Metrics Registry. 4.1. Interoperability As with any IETF Registry, the primary intention is to manage the registration of Identifiers for use within one or more protocols. In the particular case of the Performance Metrics Registry, there are two types of protocols that will use the Performance Metrics in the Performance Metrics Registry during their operation (by referring to the index values): Control Protocol: This type of protocol is used to allow one entity to request that another entity perform a measurement using a specific metric defined by the Performance Metrics Registry.
One particular example is the Large-scale Measurement of Broadband Performance (LMAP) framework [RFC7594]. Using the LMAP terminology, the Performance Metrics Registry is used in the LMAP Control Protocol to allow a Controller to schedule a Measurement Task for one or more Measurement Agents. In order to enable this use case, the entries in the Performance Metrics Registry must be sufficiently defined to allow a Measurement Agent implementation to trigger a specific Measurement Task upon the reception of a Control Protocol message. This requirement heavily constrains the types of entries that are acceptable for the Performance Metrics Registry. Report Protocol: This type of protocol is used to allow an entity to report Measurement Results to another entity. By referring to a specific Registered Performance Metric, it is possible to properly characterize the Measurement Result data being reported. Using the LMAP terminology, the Performance Metrics Registry is used in the LMAP Report Protocol to allow a Measurement Agent to report Measurement Results to a Collector. It should be noted that the LMAP framework explicitly allows for using not only the IANA-maintained Performance Metrics Registry but also other registries containing Performance Metrics, i.e., either (1) registries defined by other organizations or (2) private registries. However, others who are creating registries to be used in the context of an LMAP framework are encouraged to use the Registry format defined in this document, because this makes it easier for developers of LMAP Measurement Agents to programmatically use information found in those other registries' entries. 4.2. Single Point of Reference for Performance Metrics A Performance Metrics Registry serves as a single point of reference for Performance Metrics defined in different working groups in the IETF.
As we mentioned earlier, there are several working groups that define Performance Metrics in the IETF, and it is hard to keep track of all of them. This results in multiple definitions of similar Performance Metrics that attempt to measure the same phenomena but in slightly different (and incompatible) ways. Having a Registry would allow the IETF community and others to have a single list of relevant Performance Metrics defined by the IETF (and others, where appropriate). The single list is also an essential aspect of communication about Performance Metrics, where different entities that request measurements, execute measurements, and report the results can benefit from a common understanding of the referenced Performance Metric. 4.3. Side Benefits There are a couple of side benefits of having such a Registry. First, the Performance Metrics Registry could serve as an inventory of useful and used Performance Metrics that are normally supported by different implementations of Measurement Agents. Second, the results of measurements using the Performance Metrics should be comparable even if they are performed by different implementations and in different networks, as the Performance Metric is properly defined. BCP 176 [RFC6576] examines whether the results produced by independent implementations are equivalent in the context of evaluating the completeness and clarity of metric specifications. [RFC6576] is a BCP [RFC2026] that defines the Standards Track advancement testing for (Active) IPPM Metrics, and the same process will likely suffice to determine whether Registered Performance Metrics are sufficiently well specified to result in comparable (or equivalent) results. If a Registered Performance Metric has undergone such testing, this SHOULD be noted in "Comments and Remarks" (see Section 7.6), with a reference to the test results. 5.
Criteria for Performance Metrics Registration It is neither possible nor desirable to populate the Performance Metrics Registry with all combinations of Parameters of all Performance Metrics. A Registered Performance Metric SHOULD be: 1. Interpretable by the human user. 2. Implementable by the software or hardware designer. 3. Deployable by network operators. 4. Accurate in terms of producing equivalent results, and for interoperability and deployment across vendors. 5. Operationally useful, so that it has significant industry interest and/or has seen deployment. 6. Sufficiently tightly defined, so that different values for the Runtime Parameters do not change the fundamental nature of the measurement or change the practicality of its implementation. In essence, there needs to be evidence that (1) a candidate Registered Performance Metric has significant industry interest or has seen deployment and (2) there is agreement that the candidate Registered Performance Metric serves its intended purpose. 6. Performance Metrics Registry: Prior Attempt There was a previous attempt to define a Metrics Registry [RFC4148]. However, it was obsoleted by [RFC6248] because it was "found to be insufficiently detailed to uniquely identify IPPM metrics... [there was too much] variability possible when characterizing a metric exactly", which led to the IPPM Metrics Registry defined in [RFC4148] having "very few users, if any." Three interesting additional quotes from [RFC6248] might help to understand the issues related to that registry. 1. "It is not believed to be feasible or even useful to register every possible combination of Type P, metric parameters, and Stream parameters using the current structure of the IPPM Metrics Registry." 2. "The current registry structure has been found to be insufficiently detailed to uniquely identify IPPM metrics." 3.
"Despite apparent efforts to find current or even future users, no one responded to the call for interest in the RFC 4148 registry during the second half of 2010." The current approach learns from this by tightly defining each Registered Performance Metric with only a few variable (Runtime) Parameters to be specified by the measurement designer, if any. The idea is that entries in the Performance Metrics Registry stem from different Measurement Methods that require input (Runtime) Parameters to set factors like Source and Destination addresses (which do not change the fundamental nature of the measurement). The downside of this approach is that it could result in a large number of entries in the Performance Metrics Registry. There is agreement that less is more in this context -- it is better to have a reduced set of useful metrics rather than a large set of metrics, some with questionable usefulness. 6.1. Why This Attempt Should Succeed As mentioned in the previous section, one of the main issues with the previous Registry was that the metrics contained in the Registry were too generic to be useful. This document specifies stricter criteria for Performance Metric registration (see Section 5) and imposes a group of Performance Metrics Experts that will provide guidelines to assess if a Performance Metric is properly specified. Another key difference between this attempt and the previous one is that in this case there is at least one clear user for the Performance Metrics Registry: the LMAP framework and protocol. Because the LMAP protocol will use the Performance Metrics Registry values in its operation, this actually helps to determine if a metric is properly defined -- in particular, since we expect that the LMAP Control Protocol will enable a Controller to request that a Measurement Agent perform a measurement using a given metric by embedding the Performance Metrics Registry Identifier in the protocol.
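As a purely illustrative sketch of this use case: a Controller names a metric solely by its Registry Identifier, and a Measurement Agent maps that Identifier to a locally implemented Measurement Task. The message layout, function names, and the example Identifier and result below are hypothetical and are not defined by any LMAP specification.

```python
# Illustrative sketch only: a Controller requests a measurement by Registry
# Identifier; the Agent dispatches to a local implementation of that metric.
# Message format, Identifier value, and result fields are all hypothetical.

def make_control_message(metric_id, runtime_params):
    # The Controller identifies the metric only by its Registry Identifier,
    # supplying just the Runtime Parameters that the entry leaves open.
    return {"action": "measure", "metric_id": metric_id, "params": runtime_params}

def measure_rtdelay(params):
    # Stand-in for a real round-trip-delay measurement implementation.
    return {"metric_id": 1, "units": "Seconds", "value": 0.042}

# The Agent supports a metric only if it implements that Identifier.
AGENT_IMPLEMENTATIONS = {1: measure_rtdelay}

def handle_control_message(msg):
    impl = AGENT_IMPLEMENTATIONS.get(msg["metric_id"])
    if impl is None:
        raise ValueError("unsupported Registered Performance Metric: %d"
                         % msg["metric_id"])
    return impl(msg["params"])

msg = make_control_message(1, {"Src": "192.0.2.1", "Dst": "198.51.100.2"})
result = handle_control_message(msg)
assert result["units"] == "Seconds"
```

The point of the sketch is that a well-specified Registry Entry lets the Agent turn an Identifier into an unambiguous Measurement Task: everything the Controller does not send at runtime must be fixed by the entry itself.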
Such a metric and method are properly specified if they are defined well enough so that it is possible (and practical) to implement them in the Measurement Agent. This was the failure of the previous attempt: a Registry Entry with an undefined Type-P (Section 13 of [RFC2330]) allows measurement results to vary significantly. 7. Definition of the Performance Metrics Registry This Performance Metrics Registry is applicable to Performance Metrics used for Active Measurement, Passive Measurement, and any other form of Performance Measurement. Each category of measurement has unique properties, so some of the columns defined below are not applicable for a given metric category. In this case, the column(s) SHOULD be populated with the "N/A" value (Not Applicable). However, the "N/A" value MUST NOT be used by any metric in the following columns: Identifier, Name, URI, Status, Requester, Revision, Revision Date, Description. In the future, a new category of metrics could require additional columns, and adding new columns is a recognized form of Registry extension. The specification defining the new column(s) MUST give general guidelines for populating the new column(s) for existing entries. The columns of the Performance Metrics Registry are defined below. The columns are grouped into "Categories" to facilitate the use of the Registry. Categories are described at the "Section 7.x" heading level, and columns are described at the "Section 7.x.y" heading level. The figure below illustrates this organization. An entry (row) therefore gives a complete description of a Registered Performance Metric. Each column serves as a checklist item and helps to avoid omissions during registration and Expert Review [RFC8126]. Registry Categories and Columns are shown below in this format:

Category
------------------...
Column | Column |...
Summary
----------------------------------------------------------------
Identifier | Name | URI | Desc. | Reference | Change     | Ver |
           |      |     |       |           | Controller |     |

Metric Definition
-----------------------------------------
Reference Definition | Fixed Parameters |

Method of Measurement
---------------------------------------------------------------------
Reference | Packet     | Traffic | Sampling     | Runtime    | Role |
Method    | Stream     | Filter  | Distribution | Parameters |      |
          | Generation |         |              |            |      |

Output
-----------------------------------------
Type | Reference  | Units | Calibration |
     | Definition |       |             |

Administrative Information
-------------------------------------
Status | Requester | Rev | Rev. Date |

Comments and Remarks
--------------------

There is a blank template of the Registry provided in Section 11 of this memo. 7.1. Summary Category 7.1.1. Identifier This column provides a numeric Identifier for the Registered Performance Metric. The Identifier of each Registered Performance Metric MUST be unique. Note that revising a Metric according to the process in Section 8.2 creates a new entry in the Performance Metrics Registry with the same Identifier. The Registered Performance Metric unique Identifier is an unbounded integer (range 0 to infinity). The Identifier 0 should be Reserved. The Identifier values from 64512 to 65535 are reserved for private or experimental use, and the user may encounter overlapping uses. When adding new Registered Performance Metrics to the Performance Metrics Registry, IANA SHOULD assign the lowest available Identifier to the new Registered Performance Metric. If a Performance Metrics Expert providing review determines that there is a reason to assign a specific numeric Identifier, possibly leaving a temporary gap in the numbering, then the Performance Metrics Expert SHALL inform IANA of this decision. 7.1.2.
Name As the Name of a Registered Performance Metric is the first thing a potential human implementer will use when determining whether it is suitable for their measurement study, it is important to be as precise and descriptive as possible. In the future, users will review the Names to determine if the metric they want to measure has already been registered, or if a similar entry is available, as a basis for creating a new entry. Names are composed of the following elements, separated by an underscore character "_":

MetricType_Method_SubTypeMethod_... Spec_Units_Output

MetricType: A combination of the directional properties and the metric measured, such as and not limited to:

+===========+======================================+
| Name      | Description                          |
+===========+======================================+
| RTDelay   | Round-Trip Delay                     |
+-----------+--------------------------------------+
| RTDNS     | Response Time Domain Name Service    |
+-----------+--------------------------------------+
| RLDNS     | Response Loss Domain Name Service    |
+-----------+--------------------------------------+
| OWDelay   | One-Way Delay                        |
+-----------+--------------------------------------+
| RTLoss    | Round-Trip Loss                      |
+-----------+--------------------------------------+
| OWLoss    | One-Way Loss                         |
+-----------+--------------------------------------+
| OWPDV     | One-Way Packet Delay Variation       |
+-----------+--------------------------------------+
| OWIPDV    | One-Way Inter-Packet Delay Variation |
+-----------+--------------------------------------+
| OWReorder | One-Way Packet Reordering            |
+-----------+--------------------------------------+
| OWDuplic  | One-Way Packet Duplication           |
+-----------+--------------------------------------+
| OWBTC     | One-Way Bulk Transport Capacity      |
+-----------+--------------------------------------+
| OWMBM     | One-Way Model-Based Metric           |
+-----------+--------------------------------------+
| SPMonitor | Single-Point Monitor                 |
+-----------+--------------------------------------+
| MPMonitor | Multi-Point Monitor                  |
+-----------+--------------------------------------+

Table 1

Method: One of the methods defined in [RFC7799], such as and not limited to:

+=============+==============================================+
| Name        | Description                                  |
+=============+==============================================+
| Active      | depends on a dedicated measurement packet    |
|             | stream and observations of the stream as     |
|             | described in [RFC7799]                       |
+-------------+----------------------------------------------+
| Passive     | depends *solely* on observation of one or    |
|             | more existing packet streams as described in |
|             | [RFC7799]                                    |
+-------------+----------------------------------------------+
| HybridType1 | Hybrid Type I observations on one stream     |
|             | that combine both Active Methods and Passive |
|             | Methods as described in [RFC7799]            |
+-------------+----------------------------------------------+
| HybridType2 | Hybrid Type II observations on two or more   |
|             | streams that combine both Active Methods and |
|             | Passive Methods as described in [RFC7799]    |
+-------------+----------------------------------------------+
| Spatial     | spatial metric as described in [RFC5644]     |
+-------------+----------------------------------------------+

Table 2

SubTypeMethod: One or more subtypes to further describe the features of the entry, such as and not limited to:

+================+================================================+
| Name           | Description                                    |
+================+================================================+
| ICMP           | Internet Control Message Protocol              |
+----------------+------------------------------------------------+
| IP             | Internet Protocol                              |
+----------------+------------------------------------------------+
| DSCPxx         | where xx is replaced by a Diffserv code point  |
+----------------+------------------------------------------------+
| UDP            | User Datagram Protocol                         |
+----------------+------------------------------------------------+
| TCP            | Transport Control Protocol                     |
+----------------+------------------------------------------------+
| QUIC           | QUIC transport protocol                        |
+----------------+------------------------------------------------+
| HS             | Hand-Shake, such as TCP's 3-way HS             |
+----------------+------------------------------------------------+
| Poisson        | packet generation using Poisson distribution   |
+----------------+------------------------------------------------+
| Periodic       | periodic packet generation                     |
+----------------+------------------------------------------------+
| SendOnRcv      | sender keeps one packet in transit by sending  |
|                | when previous packet arrives                   |
+----------------+------------------------------------------------+
| PayloadxxxxB   | where xxxx is replaced by an integer, the      |
|                | number of octets or 8-bit Bytes in the Payload |
+----------------+------------------------------------------------+
| SustainedBurst | capacity test, worst case                      |
+----------------+------------------------------------------------+
| StandingQueue  | test of bottleneck queue behavior              |
+----------------+------------------------------------------------+

Table 3

SubTypeMethod values are separated by a hyphen "-" character, which indicates that they belong to this element and that their order is unimportant when considering Name uniqueness. Spec: An immutable document Identifier combined with a document section Identifier. For RFCs, this consists of the RFC number and major section number that specifies this Registry Entry in the form "RFCXXXXsecY", e.g., RFC7799sec3.
Note: The RFC number is not the primary reference specification for
   the metric definition (e.g., [RFC7679] as the primary reference
   specification for one-way delay metrics); it will contain the
   placeholder "RFCXXXXsecY" until the RFC number is assigned to the
   specifying document and would remain blank in Private Registry
   Entries without a corresponding RFC.  Anticipating the "RFC10K"
   problem, the number of the RFC continues to replace "RFCXXXX",
   regardless of the number of digits in the RFC number.  Anticipating
   Registry Entries from other standards bodies, the form of this Name
   Element MUST be proposed and reviewed for consistency and
   uniqueness by the Expert Reviewer.

   Units:  The units of measurement for the output, such as and not
      limited to:

   +============+============================+
   | Name       | Description                |
   +============+============================+
   | Seconds    |                            |
   +------------+----------------------------+
   | Ratio      | unitless                   |
   +------------+----------------------------+
   | Percent    | value multiplied by 100%   |
   +------------+----------------------------+
   | Logical    | 1 or 0                     |
   +------------+----------------------------+
   | Packets    |                            |
   +------------+----------------------------+
   | BPS        | bits per second            |
   +------------+----------------------------+
   | PPS        | packets per second         |
   +------------+----------------------------+
   | EventTotal | for unitless counts        |
   +------------+----------------------------+
   | Multiple   | more than one type of unit |
   +------------+----------------------------+
   | Enumerated | a list of outcomes         |
   +------------+----------------------------+
   | Unitless   |                            |
   +------------+----------------------------+

                 Table 4

   Output:  The type of output resulting from measurement, such as and
      not limited to:

   +==============+====================================+
   | Name         | Description                        |
   +==============+====================================+
   | Singleton    |                                    |
   +--------------+------------------------------------+
   | Raw          | multiple singletons                |
   +--------------+------------------------------------+
   | Count        |                                    |
   +--------------+------------------------------------+
   | Minimum      |                                    |
   +--------------+------------------------------------+
   | Maximum      |                                    |
   +--------------+------------------------------------+
   | Median       |                                    |
   +--------------+------------------------------------+
   | Mean         |                                    |
   +--------------+------------------------------------+
   | 95Percentile | 95th percentile                    |
   +--------------+------------------------------------+
   | 99Percentile | 99th percentile                    |
   +--------------+------------------------------------+
   | StdDev       | standard deviation                 |
   +--------------+------------------------------------+
   | Variance     |                                    |
   +--------------+------------------------------------+
   | PFI          | pass, fail, inconclusive           |
   +--------------+------------------------------------+
   | FlowRecords  | descriptions of flows observed     |
   +--------------+------------------------------------+
   | LossRatio    | lost packets to total packets, <=1 |
   +--------------+------------------------------------+

                    Table 5

   An example, as described in Section 4 of [RFC8912], is
   RTDelay_Active_IP-UDP-Periodic_RFC8912sec4_Seconds_95Percentile

   Note that private registries following the format described here
   SHOULD use the prefix "Priv_" on any Name to avoid unintended
   conflicts (further considerations are described in Section 10).
   Private Registry Entries usually have no specifying RFC; thus, the
   Spec: element has no clear interpretation.

7.1.3.  URI

   The URI column MUST contain a URL [RFC3986] that uniquely
   identifies and locates the Metric Entry so it is accessible through
   the Internet.  The URL points to a file containing all of the
   human-readable information for one Registry Entry.
The URL SHALL reference a target file that is preferably HTML-
   formatted and contains URLs to referenced sections of HTMLized
   RFCs, or other reference specifications.  These target files for
   different entries can be more easily edited and reused when
   preparing new entries.  The exact form of the URL for each target
   file, and the target file itself, will be determined by IANA and
   reside on <https://www.iana.org/>.  Section 4 of [RFC8912], as well
   as subsequent major sections of that document, provide an example
   of a target file in HTML form.

7.1.4.  Description

   A Registered Performance Metric description is a written
   representation of a particular Performance Metrics Registry Entry.
   It supplements the Registered Performance Metric Name to help
   Performance Metrics Registry users select relevant Registered
   Performance Metrics.

7.1.5.  Reference

   This entry gives the specification containing the candidate
   Registry Entry that was reviewed and agreed upon, if such an RFC or
   other specification exists.

7.1.6.  Change Controller

   This entry names the entity responsible for approving revisions to
   the Registry Entry and SHALL provide contact information (for an
   individual, where appropriate).

7.1.7.  Version (of Registry Format)

   This column gives the version number for the Registry format used,
   at the time the Performance Metric is registered.  Formats
   complying with this memo MUST use 1.0.  A new RFC that changes the
   Registry format will designate a new version number corresponding
   to that format.  The version number of Registry Entries SHALL NOT
   change unless the Registry Entry is updated to reflect the Registry
   format (following the procedures in Section 8).

7.2.  Metric Definition Category

   This category includes columns to prompt all necessary details
   related to the metric definition, including the immutable document
   reference and values of input factors, called "Fixed Parameters",
   which are left open in the immutable document but have a particular
   value defined by the Performance Metric.

7.2.1.  Reference Definition

   This entry provides a reference (or references) to the relevant
   sections of the document or documents that define the metric, as
   well as any supplemental information needed to ensure an
   unambiguous definition for implementations.  A given reference
   needs to be an immutable document, such as an RFC; for other
   standards bodies, it is likely to be necessary to reference a
   specific, dated version of a specification.

7.2.2.  Fixed Parameters

   Fixed Parameters are Parameters whose values must be specified in
   the Performance Metrics Registry.  The measurement system uses
   these values.  Where referenced metrics supply a list of Parameters
   as part of their descriptive template, a subset of the Parameters
   will be designated as Fixed Parameters.

   As an example for Active Metrics, Fixed Parameters determine most
   or all of the IPPM framework convention "packets of Type-P" as
   described in [RFC2330], such as transport protocol, payload length,
   TTL, etc.  An example for Passive Metrics is for an RTP packet loss
   calculation that relies on the validation of a packet as RTP, which
   is a multi-packet validation controlled by the MIN_SEQUENTIAL
   variable as defined by [RFC3550].  Varying MIN_SEQUENTIAL values
   can alter the loss report, and this variable could be set as a
   Fixed Parameter.

   Parameters MUST have well-defined names.  For human readers, the
   hanging-indent style is preferred, and any Parameter names and
   definitions that do not appear in the Reference Method
   Specification MUST appear in this column (or the Runtime Parameters
   column).

   Parameters MUST have a well-specified data format.
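The requirement that each Fixed Parameter have a well-defined name, data format, and value can be illustrated with a small sketch. This is a hypothetical illustration, not a registered entry: the Parameter names, ranges, and the use of a Python dataclass are assumptions made here to show how "packets of Type-P" might be pinned down.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class FixedParameters:
    """Hypothetical Fixed Parameters for an Active metric (Type-P)."""
    transport_protocol: str   # e.g., "UDP"
    payload_length: int       # octets; bounded below for this sketch
    ttl: int                  # unsigned 8-bit integer
    dscp: int = 0             # Diffserv code point, 6-bit value

    def __post_init__(self):
        # Each Parameter carries a well-specified data format and range.
        if not (0 <= self.payload_length <= 65507):
            raise ValueError("payload_length out of range for UDP")
        if not (0 <= self.ttl <= 255):
            raise ValueError("ttl must fit in 8 bits")
        if not (0 <= self.dscp <= 63):
            raise ValueError("dscp must be a 6-bit value")


# A measurement system would read these values from the Registry Entry.
type_p = FixedParameters(transport_protocol="UDP",
                         payload_length=160, ttl=255)
```

The frozen dataclass mirrors the "fixed" property: once taken from the Registry Entry, the values are not changed at runtime.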
A Parameter that is a Fixed Parameter for one Performance Metrics
   Registry Entry may be designated as a Runtime Parameter for another
   Performance Metrics Registry Entry.

7.3.  Method of Measurement Category

   This category includes columns for references to relevant sections
   of the immutable document(s) and any supplemental information
   needed to ensure an unambiguous method for implementations.

7.3.1.  Reference Method

   This entry provides references to relevant sections of immutable
   documents, such as RFC(s) (for other standards bodies, it is likely
   to be necessary to reference a specific, dated version of a
   specification) describing the Method of Measurement, as well as any
   supplemental information needed to ensure unambiguous
   interpretation for implementations referring to the immutable
   document text.

   Specifically, this section should include pointers to pseudocode or
   actual code that could be used for an unambiguous implementation.

7.3.2.  Packet Stream Generation

   This column applies to Performance Metrics that generate traffic as
   part of their Measurement Method, including, but not necessarily
   limited to, Active Metrics.  The generated traffic is referred to
   as a "stream", and this column describes its characteristics.

   Each entry for this column contains the following information:

   Value:  The name of the packet stream scheduling discipline

   Reference:  The specification where the Parameters of the stream
      are defined

   The packet stream generation may require Parameters such as the
   average packet rate and distribution truncation value for streams
   with Poisson-distributed inter-packet sending times.  If such
   Parameters are needed, they should be included in either the Fixed
   Parameters column or the Runtime Parameters column, depending on
   whether they will be fixed or will be an input for the metric.

   The simplest example of stream specification is singleton
   scheduling (see [RFC2330]), where a single atomic measurement is
   conducted.
Each atomic measurement could consist of sending a single packet
   (such as a DNS request) or sending several packets (for example, to
   request a web page).  Other streams support a series of atomic
   measurements using pairs of packets, where the packet stream
   follows a schedule defining the timing between transmitted packets,
   and an atomic measurement assesses the reception time between
   successive packets (e.g., a measurement of Inter-Packet Delay
   Variation).  More complex streams and measurement relationships are
   possible.

   Principally, two different streams are used in IPPM Metrics: (1)
   Poisson, distributed as described in [RFC2330], and (2) periodic,
   as described in [RFC3432].  Both Poisson and periodic have their
   own unique Parameters, and the relevant set of Parameter names and
   values should be included in either the Fixed Parameters column or
   the Runtime Parameters column.

7.3.3.  Traffic Filter

   This column applies to Performance Metrics that observe packets
   flowing through (the device with) the Measurement Agent, i.e.,
   packets that are not necessarily addressed to the Measurement
   Agent.  This includes, but is not limited to, Passive Metrics.  The
   filter specifies the traffic that is measured.  This includes
   protocol field values/ranges, such as address ranges, and flow or
   session Identifiers.

   The Traffic Filter depends on the needs of the metric and on a
   balance between an operator's measurement needs and a user's need
   for privacy.  Mechanics for conveying the filter criteria might be
   the BPF (Berkeley Packet Filter) or PSAMP (Packet Sampling)
   [RFC5475] Property Match Filtering, which reuses IPFIX [RFC7012].
An example BPF string for matching TCP/80 traffic to remote
   Destination net 192.0.2.0/24 would be "dst net 192.0.2.0/24 and tcp
   dst port 80".  More complex filter engines may allow for matching
   using Deep Packet Inspection (DPI) technology.

   The Traffic Filter includes the following information:

   Type:  The type of Traffic Filter used, e.g., BPF, PSAMP, OpenFlow
      rule, etc., as defined by a normative reference

   Value:  The actual set of rules expressed

7.3.4.  Sampling Distribution

   The sampling distribution defines, out of all of the packets that
   match the Traffic Filter, which one or more of those packets are
   actually used for the measurement.  One possibility is "all", which
   implies that all packets matching the Traffic Filter are
   considered, but there may be other sampling strategies.  It
   includes the following information:

   Value:  The name of the sampling distribution

   Reference definition:  Pointer to the specification where the
      sampling distribution is properly defined

   The sampling distribution may require Parameters.  If such
   Parameters are needed, they should be included in either the Fixed
   Parameters column or the Runtime Parameters column, depending on
   whether they will be fixed or will be an input for the metric.

   PSAMP is documented in "Sampling and Filtering Techniques for IP
   Packet Selection" [RFC5475], while "A Framework for Packet
   Selection and Reporting" [RFC5474] provides more background
   information.  The sampling distribution Parameters might be
   expressed in terms of the model described in "Information Model for
   Packet Sampling Exports" [RFC5477] and the process provided in
   "Flow Selection Techniques" [RFC7014].

7.3.5.  Runtime Parameters

   In contrast to the Fixed Parameters, Runtime Parameters are
   Parameters that do not change the fundamental nature of the
   measurement, and their values are not specified in the Performance
   Metrics Registry.  They are left as variables in the Registry, as
   an aid to the measurement system implementer or user.  Their values
   are supplied on execution, configured into the measurement system,
   and reported with the Measurement Results (so that the context is
   complete).

   Where metrics supply a list of Parameters as part of their
   descriptive template, a subset of the Parameters will be designated
   as Runtime Parameters.

   Parameters MUST have well-defined names.  For human readers, the
   hanging-indent style is preferred, and the names and definitions
   that do not appear in the Reference Method Specification MUST
   appear in this column.

   A data format for each Runtime Parameter MUST be specified in this
   column, to simplify the control and implementation of measurement
   devices.  For example, Parameters that include an IPv4 address can
   be encoded as a 32-bit integer (i.e., a binary base64-encoded
   value) or "ip-address" as defined in [RFC6991].  The actual
   encoding(s) used must be explicitly defined for each Runtime
   Parameter.  IPv6 addresses and options MUST be accommodated,
   allowing Registered Performance Metrics to be used in that address
   family.  Other address families are permissible.

   Examples of Runtime Parameters include IP addresses, measurement
   point designations, start times and end times for measurement, and
   other information essential to the Method of Measurement.

7.3.6.  Role

   In some Methods of Measurement, there may be several Roles defined,
   e.g., for a one-way packet delay Active Measurement, there is one
   Measurement Agent that generates the packets and another Agent that
   receives the packets.  This column contains the name of the Role(s)
   for this particular entry.  In the one-way delay example above,
   there should be two entries in the Registry's Role column, one for
   each Role (Source and Destination).  When a Measurement Agent is
   instructed to perform the "Source" Role for the one-way delay
   metric, the Agent knows that it is required to generate packets.

   The values for this field are defined in the Reference Method of
   Measurement (and this frequently results in abbreviated Role names
   such as "Src").  When the Role column of a Registry Entry defines
   more than one Role, the Role SHALL be treated as a Runtime
   Parameter and supplied for execution.  It should be noted that the
   LMAP framework [RFC7594] distinguishes the Role from other Runtime
   Parameters.

7.4.  Output Category

   For entries that involve a stream and many singleton measurements,
   a statistic may be specified in this column to summarize the
   results to a single value.  If the complete set of measured
   singletons is output, this will be specified here.  Some metrics
   embed one specific statistic in the reference metric definition,
   while others allow several output types or statistics.

7.4.1.  Type

   This column contains the name of the output type.  The output type
   defines a single type of result that the metric produces.  It can
   be the raw results (packet send times and singleton metrics), or it
   can be a summary statistic.  The specification of the output type
   MUST define the format of the output.  In some systems, format
   specifications will simplify both measurement implementation and
   collection/storage tasks.
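The distinction between a "Raw" output and a summary-statistic output can be sketched as follows. This is an illustration only: the nearest-rank percentile method used here is an assumption for the sketch, and a real Registry Entry would cite the exact statistical definition in its Reference Definition column.

```python
def percentile_nearest_rank(values, p):
    """Nearest-rank percentile of a sample (assumed method for this
    sketch): the smallest value whose rank is at least ceil(p*N/100)."""
    ordered = sorted(values)
    rank = max(1, -(-p * len(ordered) // 100))  # ceil(p*N/100), >= 1
    return ordered[rank - 1]


# A sample of measured round-trip delay singletons, in seconds.
# Emitting this list unchanged would be the "Raw" output type.
raw_delays = [0.021, 0.019, 0.034, 0.025, 0.022, 0.020, 0.051,
              0.023, 0.024, 0.026]

# The "95Percentile" output type summarizes the same sample into a
# single value, in the same Metric Units (Seconds).
p95 = percentile_nearest_rank(raw_delays, 95)  # 0.051 for this sample
```

Note that, per the text above, a system needing both the raw sample and the percentile would register a distinct combined output type rather than emitting two outputs from one registered type.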
Note that if two different statistics are required from a single
   measurement (for example, both "Xth percentile mean" and "Raw"),
   then a new output type must be defined ("Xth percentile mean AND
   Raw").  See Section 7.1.2 above for a list of output types.

7.4.2.  Reference Definition

   This column contains a pointer to the specification(s) where the
   output type and format are defined.

7.4.3.  Metric Units

   The measured results must be expressed using some standard
   dimension or units of measure.  This column provides the units.

   When a sample of singletons (see Section 11 of [RFC2330] for
   definitions of these terms) is collected, this entry will specify
   the units for each measured value.

7.4.4.  Calibration

   Some specifications for Methods of Measurement include the ability
   to perform an error calibration.  Section 3.7.3 of [RFC7679] is one
   example.  In the Registry Entry, this field will identify a method
   of calibration for the metric, and, when available, the measurement
   system SHOULD perform the calibration when requested and produce
   the output with an indication that it is the result of a
   calibration method.  In-situ calibration could be enabled with an
   internal loopback that includes as much of the measurement system
   as possible, performs address manipulation as needed, and provides
   some form of isolation (e.g., deterministic delay) to avoid send-
   receive interface contention.  Some portion of the random and
   systematic error can be characterized in this way.

   For one-way delay measurements, the error calibration must include
   an assessment of the internal clock synchronization with its
   external reference (this internal clock is supplying timestamps for
   measurement).  In practice, the time offsets of clocks at both the
   Source and Destination are needed to estimate the systematic error
   due to imperfect clock synchronization (the time offsets are
   smoothed; thus, the random variation is not usually represented in
   the results).
Both internal loopback calibration and clock synchronization can be
   used to estimate the *available accuracy* of the Output Metric
   Units.  For example, repeated loopback delay measurements will
   reveal the portion of the output result resolution that is the
   result of system noise and is thus inaccurate.

7.5.  Administrative Information

7.5.1.  Status

   This entry indicates the status of the specification of this
   Registered Performance Metric.  Allowed values are 'Current',
   'Deprecated', and 'Obsolete'.  All newly defined Registered
   Performance Metrics have 'Current' Status.

7.5.2.  Requester

   This entry indicates the requester for the Registered Performance
   Metric.  The requester MAY be a document (such as an RFC) or a
   person.

7.5.3.  Revision

   This entry indicates the revision number of a Registered
   Performance Metric, starting at 0 for Registered Performance
   Metrics at the time of definition and incremented by one for each
   revision.  However, in the case of a non-backward-compatible
   revision, see Section 8.3.

7.5.4.  Revision Date

   This entry indicates the date of acceptance of the most recent
   revision for the Registered Performance Metric.  The date SHALL be
   determined by IANA and the reviewing Performance Metrics Expert.

7.6.  Comments and Remarks

   Besides providing additional details that do not appear in other
   categories, this open category (single column) allows unforeseen
   issues to be addressed by simply updating this informational entry.

8.  Processes for Managing the Performance Metrics Registry Group

   Once a Performance Metric or set of Performance Metrics has been
   identified for a given application, candidate Performance Metrics
   Registry Entry specifications prepared in accordance with Section 7
   should be submitted to IANA to follow the process for review by the
   Performance Metrics Experts, as defined below.
This process is also used for other changes to a Performance Metrics
   Registry Entry, such as deprecation or revision, as described later
   in this section.

   It is desirable that the author(s) of a candidate Performance
   Metrics Registry Entry seek review in the relevant IETF working
   group or offer the opportunity for review on the working group
   mailing list.

8.1.  Adding New Performance Metrics to the Performance Metrics
      Registry

   Requests to add Registered Performance Metrics in the Performance
   Metrics Registry SHALL be submitted to IANA, which forwards the
   request to a designated group of experts (Performance Metrics
   Experts) appointed by the IESG; these are the reviewers called for
   by the Specification Required policy [RFC8126] defined for the
   Performance Metrics Registry.  The Performance Metrics Experts
   review the request for such things as compliance with this
   document, compliance with other applicable Performance Metrics-
   related RFCs, and consistency with the currently defined set of
   Registered Performance Metrics.

   The most efficient path for submission begins with preparation of
   an Internet-Draft containing the proposed Performance Metrics
   Registry Entry using the template in Section 11, so that the
   submission formatting will benefit from the normal IETF Internet-
   Draft submission processing (including HTMLization).

   Submission to IANA may be during IESG review (leading to IETF
   Standards Action), where an Internet-Draft proposes one or more
   Registered Performance Metrics to be added to the Performance
   Metrics Registry, including the text of the proposed Registered
   Performance Metric(s).  If an RFC-to-be includes a Performance
   Metric and a proposed Performance Metrics Registry Entry but the
   Performance Metrics Expert's review determines that one or more of
   the criteria listed in Section 5 have not been met, then the
   proposed Performance Metrics Registry Entry MUST be removed from
   the text.
Once evidence exists that the Performance Metric meets the criteria
   in Section 5, the proposed Performance Metrics Registry Entry
   SHOULD be submitted to IANA to be evaluated in consultation with
   the Performance Metrics Experts for registration at that time.

   Authors of proposed Registered Performance Metrics SHOULD review
   compliance with the specifications in this document to check their
   submissions before sending them to IANA.

   At least one Performance Metrics Expert should endeavor to complete
   referred reviews in a timely manner.  If the request is acceptable,
   the Performance Metrics Experts signify their approval to IANA, and
   IANA updates the Performance Metrics Registry.  If the request is
   not acceptable, the Performance Metrics Experts MAY coordinate with
   the requester to change the request so that it is compliant;
   otherwise, IANA SHALL coordinate resolution of issues on behalf of
   the expert.  The Performance Metrics Experts MAY choose to reject
   clearly frivolous or inappropriate change requests outright, but
   such exceptional circumstances should be rare.

   If the proposed Metric is unique in a significant way, in order to
   properly describe the Metric, it may be necessary to propose a new
   Name Element Registry, or (more likely) a new Entry in an existing
   Name Element Registry.  This proposal is part of the request for
   the new Metric, so that it undergoes the same IANA review and
   approval process.

   Decisions by the Performance Metrics Experts may be appealed per
   Section 10 of [RFC8126].
8.2.  Backward-Compatible Revision of Registered Performance Metrics

   A request for revision is only permitted when the requested changes
   maintain backward compatibility with implementations of the prior
   Performance Metrics Registry Entry describing a Registered
   Performance Metric (entries with lower revision numbers but having
   the same Identifier and Name).

   The purpose of the Status field in the Performance Metrics Registry
   is to indicate whether the entry for a Registered Performance
   Metric is 'Current', 'Deprecated', or 'Obsolete'.  The term
   'deprecated' is used when an entry is replaced, either with a
   backwards-compatible revision (this sub-section) or with a non-
   backwards-compatible revision (in Section 8.3).  In addition, no
   policy is defined for revising the Performance Metric Entries in
   the IANA Registry or addressing errors therein.  To be clear,
   changes and deprecations within the Performance Metrics Registry
   are not encouraged and should be avoided to the extent possible.
   However, in recognition that change is inevitable, the provisions
   of this section address the need for revisions.

   Revisions are initiated by sending a candidate Registered
   Performance Metric definition to IANA, per Section 8.1, identifying
   the existing Performance Metrics Registry Entry, and explaining how
   and why the existing entry should be revised.

   The primary requirement in the definition of procedures for
   managing changes to existing Registered Performance Metrics is
   avoidance of measurement interoperability problems; the Performance
   Metrics Experts must work to maintain interoperability above all
   else.
Changes to Registered Performance Metrics may only be done in an
   interoperable way; necessary changes that cannot be done in a way
   that allows interoperability with unchanged implementations MUST
   result in the creation of a new Registered Performance Metric (with
   a new Name, replacing the RFCXXXXsecY portion of the Name) and
   possibly the deprecation of the earlier metric.

   A change to a Registered Performance Metric SHALL be determined to
   be backward compatible when:

   1.  it involves the correction of an error that is obviously only
       editorial, or

   2.  it corrects an ambiguity in the Registered Performance Metric's
       definition, which itself leads to issues severe enough to
       prevent the Registered Performance Metric's usage as originally
       defined, or

   3.  it corrects missing information in the metric definition
       without changing its meaning (e.g., the explicit definition of
       'quantity' semantics for numeric fields without a Data Type
       Semantics value), or

   4.  it harmonizes with an external reference that was itself
       corrected, or

   5.  if the current Registry format has been revised by adding a new
       column that is not relevant to an existing Registered
       Performance Metric (i.e., the new column can be safely filled
       in with "Not Applicable").

   If a Performance Metric revision is deemed permissible and backward
   compatible by the Performance Metrics Experts, according to the
   rules in this document, IANA SHOULD execute the change(s) in the
   Performance Metrics Registry.  The requester of the change is
   appended to the original requester in the Performance Metrics
   Registry.  The Name of the revised Registered Performance Metric,
   including the RFCXXXXsecY portion of the Name, SHALL remain
   unchanged even when the change is the result of IETF Standards
   Action.  The revised Registry Entry SHOULD reference the new
   immutable document, such as an RFC.
For other standards bodies, it is likely to be necessary to
   reference a specific, dated version of a specification, in an
   appropriate category and column.

   Each Registered Performance Metric in the Performance Metrics
   Registry has a revision number, starting at zero.  Each change to a
   Registered Performance Metric following this process increments the
   revision number by one.

   When a revised Registered Performance Metric is accepted into the
   Performance Metrics Registry, the date of acceptance of the most
   recent revision is placed into the Revision Date column of the
   Registry for that Registered Performance Metric.  Where applicable,
   additions to Registered Performance Metrics in the form of text in
   the Comments or Remarks column should include the date, but such
   additions may not constitute a revision according to this process.

   Older versions of the updated Metric Entries are kept in the
   Registry for archival purposes.  The older entries are kept with
   all fields unmodified (including Revision Date) except for the
   Status field, which SHALL be changed to 'Deprecated'.

   This process should not in any way be construed as allowing the
   Performance Metrics Experts to overrule IETF consensus.
   Specifically, any Registered Performance Metrics that were added to
   the Performance Metrics Registry with IETF consensus require IETF
   consensus for revision or deprecation.

8.3.  Non-Backward-Compatible Deprecation of Registered Performance
      Metrics

   This section describes how to make a non-backward-compatible update
   to a Registered Performance Metric.  A Registered Performance
   Metric MAY be deprecated and replaced when:

   1.  the Registered Performance Metric definition has an error or
       shortcoming that cannot be permissibly changed per Section 8.2
       ("Backward-Compatible Revision of Registered Performance
       Metrics"), or

   2.  the deprecation harmonizes with an external reference that was
       itself deprecated through that reference's accepted deprecation
       method.

   A request for deprecation is sent to IANA, which passes it to the
   Performance Metrics Experts for review.

   When deprecating a Performance Metric, the Performance Metric
   Description in the Performance Metrics Registry MUST be updated to
   explain the deprecation, as well as to refer to the new Performance
   Metric created to replace the deprecated Performance Metric.

   When a new, non-backward-compatible Performance Metric replaces a
   (now) deprecated metric, the revision number of the new Registered
   Performance Metric is incremented over the value in the deprecated
   version, and the current date is entered as the Revision Date of
   the new Registered Performance Metric.

   The intentional use of deprecated Registered Performance Metrics
   should result in a log entry or human-readable warning by the
   respective application.

   Names and Metric IDs of deprecated Registered Performance Metrics
   must not be reused.

   The deprecated entries are kept with all Administrative columns
   unmodified, except the Status field (which is changed to
   'Deprecated').

8.4.  Obsolete Registry Entries

   Existing Registry Entries may become obsolete over time due to:

   1.  the Registered Performance Metric is found to contain
       considerable errors (and no one sees the value in the effort to
       fix it), or

   2.  one or more critical References (or sections thereof) have been
       designated obsolete by the SDO, or

   3.  other reasons brought to the attention of IANA and the Registry
       Experts.

   When a Performance Metric Registry Entry is declared obsolete, the
   Performance Metric Description in the Performance Metrics Registry
   is updated to explain the reasons the Entry is now obsolete and has
   not been replaced (Deprecation always involves replacement).
Obsolete entries are kept with all Administrative columns unmodified,
   except the Status field (which is changed to 'Obsolete').

8.5.  Registry Format Version and Future Changes/Extensions

   The Registry Format Version defined in this memo is 1.0, and
   candidate Registry Entries complying with this memo MUST use 1.0.
   The Registry Format can only be updated by publishing a new RFC
   with the new format (Standards Action).  When a Registered
   Performance Metric is created or revised, then it uses the most
   recent Registry Format Version.

   Only one form of Registry extension is envisaged: adding columns,
   or both categories and columns, to accommodate unanticipated
   aspects of new measurements and metric categories.  If the
   Performance Metrics Registry is extended in this way, the version
   number of future entries complying with the extension SHALL be
   incremented (in either the unit or the tenths digit, depending on
   the degree of extension).

9.  Security Considerations

   This document defines a Registry structure and does not itself
   introduce any new security considerations for the Internet.  The
   definition of Performance Metrics for this Registry may introduce
   some security concerns, but the mandatory references should have
   their own considerations for security, and such definitions should
   be reviewed with security in mind if the security considerations
   are not covered by one or more reference standards.

   The aggregated results of the Performance Metrics described in this
   Registry might reveal network topology information that may be
   considered sensitive.  If such cases are found, then access control
   mechanisms should be applied.

10.  IANA Considerations

   With the background and processes described in earlier sections,
   IANA has taken the actions described below.

10.1.  Registry Group

   The new Registry group is named Performance Metrics.
This document refers to it as the "Performance Metrics Group" or "Registry Group", meaning all registrations appearing on <https://www.iana.org/assignments/performance-metrics>.

For clarity, note that this document and [RFC8912] use the following conventions to refer to the various IANA registries related to Performance Metrics.

   +===============+===========================+=====================+
   |               | RFC 8911 and RFC 8912     | IANA Web page       |
   +===============+===========================+=====================+
   | Page Title    | Performance Metrics Group | Performance Metrics |
   +---------------+---------------------------+---------------------+
   | Main Registry | Performance Metrics       | Performance Metrics |
   |               | Registry                  | Registry            |
   +---------------+---------------------------+---------------------+
   | Registry Row  | Performance Metrics       | registration (also  |
   |               | Registry Entry            | template)           |
   +---------------+---------------------------+---------------------+

                                Table 6

   Registration Procedure: Specification Required
   Reference: RFC 8911
   Experts: Performance Metrics Experts

10.2. Performance Metrics Name Elements

This memo specifies and populates the Registries for the Performance Metric Name Elements. The Name assigned to a Performance Metric Registry Entry consists of multiple Elements separated by an "_" (underscore), in the order defined in Section 7.1.2. IANA has created the following registries, which contain the current set of possibilities for each Element in the Name.

   MetricType
   Method
   SubTypeMethod
   Spec
   Units
   Output

At creation, IANA has populated the Registered Performance Metrics Name Elements using the lists of values for each Name Element listed in Section 7.1.2. The Name Elements in each Registry are case sensitive.
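The naming convention described above (Name Elements joined by an underscore, in the order defined in Section 7.1.2) can be illustrated with a minimal sketch. The element values used here are hypothetical examples, not registered values:

```python
# Sketch: compose a Registry Entry Name from its Name Elements, joined by
# "_" in the order defined in Section 7.1.2. Element values are case
# sensitive. The example values below are hypothetical, for illustration.

NAME_ELEMENT_ORDER = ["MetricType", "Method", "SubTypeMethod",
                      "Spec", "Units", "Output"]

def compose_name(elements: dict) -> str:
    """Join the Name Element values with underscores, in the defined order."""
    missing = [e for e in NAME_ELEMENT_ORDER if e not in elements]
    if missing:
        raise ValueError(f"missing Name Elements: {missing}")
    return "_".join(elements[e] for e in NAME_ELEMENT_ORDER)

example = {                       # hypothetical element values
    "MetricType": "RTDelay",
    "Method": "Active",
    "SubTypeMethod": "IP-UDP",
    "Spec": "RFCXXXXsec5",
    "Units": "Seconds",
    "Output": "95Percentile",
}
print(compose_name(example))
# -> RTDelay_Active_IP-UDP_RFCXXXXsec5_Seconds_95Percentile
```

The authoritative lists of permitted element values live in the IANA registries named above, not in this sketch.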
When preparing a Metric Entry for registration, the developer SHOULD choose Name Elements from among the registered elements. However, if the proposed metric is unique in a significant way, it may be necessary to propose a new Name Element to properly describe the metric, as described below.

A candidate Metric Entry proposes a set of values for its Name Elements. These are reviewed by IANA and an Expert Reviewer. It is possible that a candidate Metric Entry proposes a new value for a Name Element (that is, one that is not in the existing list of possibilities), or even that it proposes a new Name Element. Such new assignments are administered by IANA through the Specification Required policy [RFC8126], which includes Expert Review (i.e., review by one of a group of experts -- in the case of this document, the Performance Metrics Experts, who are appointed by the IESG upon recommendation of the Transport Area Directors).

10.3. New Performance Metrics Registry

This document specifies the Performance Metrics Registry. The Registry contains the following columns in the Summary category:

   Identifier
   Name
   URI
   Description
   Reference
   Change Controller
   Version

Descriptions of these columns and additional information found in the template for Registry Entries (categories and columns) are further defined in Section 7.

The Identifier 0 should be Reserved. The Registered Performance Metric unique Identifier is an unbounded integer (range 0 to infinity). The Identifier values from 64512 to 65535 are reserved for private or experimental use, and the user may encounter overlapping uses.
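The Identifier rules just described (0 is Reserved, 64512 to 65535 are set aside for private or experimental use, and the space is otherwise unbounded) can be sketched as a small classification check; this is an illustration of the registry rules, not a normative definition:

```python
# Sketch: classify a Registered Performance Metric Identifier per the rules
# of Section 10.3: 0 is Reserved, 64512-65535 are for private or
# experimental use, and the Identifier space is unbounded above.

PRIVATE_RANGE = range(64512, 65536)

def classify_identifier(ident: int) -> str:
    if ident < 0:
        raise ValueError("Identifiers are non-negative integers")
    if ident == 0:
        return "Reserved"
    if ident in PRIVATE_RANGE:
        return "Private/Experimental Use"
    return "Assignable"

print(classify_identifier(0))       # Reserved
print(classify_identifier(64512))   # Private/Experimental Use
print(classify_identifier(100000))  # Assignable (the range is unbounded)
```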
When adding new Registered Performance Metrics to the Performance Metrics Registry, IANA SHOULD assign the lowest available Identifier to the new Registered Performance Metric. If a Performance Metrics Expert providing review determines that there is a reason to assign a specific numeric Identifier, possibly leaving a temporary gap in the numbering, then the Performance Metrics Expert SHALL inform IANA of this decision.

Names starting with the prefix "Priv_" are reserved for private use and are not considered for registration. The Name column entries are further defined in Section 7.

The URI column will have a URL to each completed Registry Entry. The Registry Entry text SHALL be HTMLized to aid the reader (similar to the way that Internet-Drafts are HTMLized; the same tool can perform the function), with links to referenced section(s) of an RFC or another immutable document.

The Reference column will include an RFC number, an approved specification designator from another standards body, or some other immutable document.

New assignments for the Performance Metrics Registry will be administered by IANA through the Specification Required policy [RFC8126] (which includes Expert Review, i.e., review by one of a group of experts -- in the case of this document, the Performance Metrics Experts, who are appointed by the IESG upon recommendation of the Transport Area Directors) or by Standards Action. The experts can be initially drawn from the Working Group Chairs, document editors, and members of the Performance Metrics Directorate, among other sources of experts.

Extensions to the Performance Metrics Registry require IETF Standards Action. Only one form of Registry extension is envisaged:

*  Adding columns, or both categories and columns, to accommodate unanticipated aspects of new measurements and metric categories.
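The assignment rule above (IANA SHOULD assign the lowest available Identifier, while 0 and the private/experimental range 64512 to 65535 are never assigned) can be sketched as follows. In this sketch, an expert-directed gap is simply modeled by pre-adding the set-aside numbers to the assigned set:

```python
# Sketch: find the lowest available Identifier, skipping 0 (Reserved),
# the private/experimental range 64512-65535, and already-assigned values.
# Identifiers an Expert has set aside (temporary gaps) can be modeled by
# including them in the assigned set until they are released.

def lowest_available(assigned: set) -> int:
    candidate = 1
    while candidate in assigned or 64512 <= candidate <= 65535:
        candidate += 1
    return candidate

print(lowest_available({1, 2, 3, 5}))   # -> 4 (the gap is filled first)
```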
If the Performance Metrics Registry is extended in this way, the version number of future entries complying with the extension SHALL be incremented (in either the unit or the tenths digit, depending on the degree of extension).

11. Blank Registry Template

This section provides a blank template to help IANA and Registry Entry writers.

11.1. Summary

This category includes multiple indexes to the Registry Entry: the element ID and Metric Name.

11.1.1. ID (Identifier)

<insert a numeric Identifier, an integer, TBD>

11.1.2. Name

<insert the Name, according to the metric naming convention>

11.1.3. URI

URL: https://www.iana.org/performance-metrics/ ... <Name>

11.1.4. Description

<provide a description>

11.1.5. Reference

<provide the RFC or other specification that contains the approved candidate Registry Entry>

11.1.6. Change Controller

<provide information regarding the entity responsible for approving revisions to the Registry Entry (including contact information for an individual, where appropriate)>

11.1.7. Version (of Registry Format)

11.2. Metric Definition

This category includes columns to prompt the entry of all necessary details related to the metric definition, including the immutable document reference and values of input factors, called "Fixed Parameters".

11.2.1. Reference Definition

<provide a full bibliographic reference to an immutable document>

<provide a specific section reference and additional clarifications, if needed>

11.2.2. Fixed Parameters

<list and specify Fixed Parameters, input factors that must be determined and embedded in the measurement system for use when needed>

11.3. Method of Measurement

This category includes columns for references to relevant sections of the immutable document(s) and any supplemental information needed to ensure an unambiguous method for implementations.

11.3.1.
Reference Method

<for the metric, insert relevant section references and supplemental info>

11.3.2. Packet Stream Generation

<provide a list of generation Parameters and section/spec references if needed>

11.3.3. Traffic Filtering (Observation) Details

This category provides the filter details (when present), which qualify the set of packets that contribute to the measured results from among all packets observed.

<provide a section reference>

11.3.4. Sampling Distribution

<insert time distribution details, or how this is different from the filter>

11.3.5. Runtime Parameters and Data Format

Runtime Parameters are input factors that must be determined, configured into the measurement system, and reported with the results for the context to be complete.

<provide a list of Runtime Parameters and their data formats>

11.3.6. Roles

<list the names of the different Roles from the Measurement Method>

11.4. Output

This category specifies all details of the output of measurements using the metric.

11.4.1. Type

<insert the name of the output type -- raw results or a selected summary statistic>

11.4.2. Reference Definition

<describe the reference data format for each type of result>

11.4.3. Metric Units

<insert units for the measured results, and provide the reference specification>

11.4.4. Calibration

<insert information on calibration>

11.5. Administrative Items

This category provides administrative information.

11.5.1. Status

<provide status: 'Current' or 'Deprecated'>

11.5.2. Requester

<provide a person's name, an RFC number, etc.>

11.5.3. Revision

<provide the revision number: starts at 0>

11.5.4. Revision Date

<provide the date, in YYYY-MM-DD format>

11.6. Comments and Remarks

<list any additional (informational) details for this entry>

12. References

12.1.
Normative References

[RFC2026]  Bradner, S., "The Internet Standards Process -- Revision 3", BCP 9, RFC 2026, DOI 10.17487/RFC2026, October 1996, <https://www.rfc-editor.org/info/rfc2026>.

[RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate Requirement Levels", BCP 14, RFC 2119, DOI 10.17487/RFC2119, March 1997, <https://www.rfc-editor.org/info/rfc2119>.

[RFC2330]  Paxson, V., Almes, G., Mahdavi, J., and M. Mathis, "Framework for IP Performance Metrics", RFC 2330, DOI 10.17487/RFC2330, May 1998, <https://www.rfc-editor.org/info/rfc2330>.

[RFC3986]  Berners-Lee, T., Fielding, R., and L. Masinter, "Uniform Resource Identifier (URI): Generic Syntax", STD 66, RFC 3986, DOI 10.17487/RFC3986, January 2005, <https://www.rfc-editor.org/info/rfc3986>.

[RFC5644]  Stephan, E., Liang, L., and A. Morton, "IP Performance Metrics (IPPM): Spatial and Multicast", RFC 5644, DOI 10.17487/RFC5644, October 2009, <https://www.rfc-editor.org/info/rfc5644>.

[RFC6390]  Clark, A. and B. Claise, "Guidelines for Considering New Performance Metric Development", BCP 170, RFC 6390, DOI 10.17487/RFC6390, October 2011, <https://www.rfc-editor.org/info/rfc6390>.

[RFC6576]  Geib, R., Ed., Morton, A., Fardid, R., and A. Steinmitz, "IP Performance Metrics (IPPM) Standard Advancement Testing", BCP 176, RFC 6576, DOI 10.17487/RFC6576, March 2012, <https://www.rfc-editor.org/info/rfc6576>.

[RFC7799]  Morton, A., "Active and Passive Metrics and Methods (with Hybrid Types In-Between)", RFC 7799, DOI 10.17487/RFC7799, May 2016, <https://www.rfc-editor.org/info/rfc7799>.

[RFC8126]  Cotton, M., Leiba, B., and T. Narten, "Guidelines for Writing an IANA Considerations Section in RFCs", BCP 26, RFC 8126, DOI 10.17487/RFC8126, June 2017, <https://www.rfc-editor.org/info/rfc8126>.

[RFC8174]  Leiba, B., "Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words", BCP 14, RFC 8174, DOI 10.17487/RFC8174, May 2017, <https://www.rfc-editor.org/info/rfc8174>.

12.2.
Informative References

[RFC2681]  Almes, G., Kalidindi, S., and M. Zekauskas, "A Round-trip Delay Metric for IPPM", RFC 2681, DOI 10.17487/RFC2681, September 1999, <https://www.rfc-editor.org/info/rfc2681>.

[RFC3432]  Raisanen, V., Grotefeld, G., and A. Morton, "Network performance measurement with periodic streams", RFC 3432, DOI 10.17487/RFC3432, November 2002, <https://www.rfc-editor.org/info/rfc3432>.

[RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V. Jacobson, "RTP: A Transport Protocol for Real-Time Applications", STD 64, RFC 3550, DOI 10.17487/RFC3550, July 2003, <https://www.rfc-editor.org/info/rfc3550>.

[RFC3611]  Friedman, T., Ed., Caceres, R., Ed., and A. Clark, Ed., "RTP Control Protocol Extended Reports (RTCP XR)", RFC 3611, DOI 10.17487/RFC3611, November 2003, <https://www.rfc-editor.org/info/rfc3611>.

[RFC4148]  Stephan, E., "IP Performance Metrics (IPPM) Metrics Registry", BCP 108, RFC 4148, DOI 10.17487/RFC4148, August 2005, <https://www.rfc-editor.org/info/rfc4148>.

[RFC5474]  Duffield, N., Ed., Chiou, D., Claise, B., Greenberg, A., Grossglauser, M., and J. Rexford, "A Framework for Packet Selection and Reporting", RFC 5474, DOI 10.17487/RFC5474, March 2009, <https://www.rfc-editor.org/info/rfc5474>.

[RFC5475]  Zseby, T., Molina, M., Duffield, N., Niccolini, S., and F. Raspall, "Sampling and Filtering Techniques for IP Packet Selection", RFC 5475, DOI 10.17487/RFC5475, March 2009, <https://www.rfc-editor.org/info/rfc5475>.

[RFC5477]  Dietz, T., Claise, B., Aitken, P., Dressler, F., and G. Carle, "Information Model for Packet Sampling Exports", RFC 5477, DOI 10.17487/RFC5477, March 2009, <https://www.rfc-editor.org/info/rfc5477>.

[RFC6035]  Pendleton, A., Clark, A., Johnston, A., and H. Sinnreich, "Session Initiation Protocol Event Package for Voice Quality Reporting", RFC 6035, DOI 10.17487/RFC6035, November 2010, <https://www.rfc-editor.org/info/rfc6035>.
[RFC6248]  Morton, A., "RFC 4148 and the IP Performance Metrics (IPPM) Registry of Metrics Are Obsolete", RFC 6248, DOI 10.17487/RFC6248, April 2011, <https://www.rfc-editor.org/info/rfc6248>.

[RFC6991]  Schoenwaelder, J., Ed., "Common YANG Data Types", RFC 6991, DOI 10.17487/RFC6991, July 2013, <https://www.rfc-editor.org/info/rfc6991>.

[RFC7012]  Claise, B., Ed. and B. Trammell, Ed., "Information Model for IP Flow Information Export (IPFIX)", RFC 7012, DOI 10.17487/RFC7012, September 2013, <https://www.rfc-editor.org/info/rfc7012>.

[RFC7014]  D'Antonio, S., Zseby, T., Henke, C., and L. Peluso, "Flow Selection Techniques", RFC 7014, DOI 10.17487/RFC7014, September 2013, <https://www.rfc-editor.org/info/rfc7014>.

[RFC7594]  Eardley, P., Morton, A., Bagnulo, M., Burbridge, T., Aitken, P., and A. Akhter, "A Framework for Large-Scale Measurement of Broadband Performance (LMAP)", RFC 7594, DOI 10.17487/RFC7594, September 2015, <https://www.rfc-editor.org/info/rfc7594>.

[RFC7679]  Almes, G., Kalidindi, S., Zekauskas, M., and A. Morton, Ed., "A One-Way Delay Metric for IP Performance Metrics (IPPM)", STD 81, RFC 7679, DOI 10.17487/RFC7679, January 2016, <https://www.rfc-editor.org/info/rfc7679>.

[RFC8912]  Morton, A., Bagnulo, M., Eardley, P., and K. D'Souza, "Initial Performance Metrics Registry Entries", RFC 8912, DOI 10.17487/RFC8912, November 2021, <https://www.rfc-editor.org/info/rfc8912>.

Acknowledgments

Thanks to Brian Trammell and Bill Cerveny, IPPM co-chairs during the development of this memo, for leading several brainstorming sessions on this topic. Thanks to Barbara Stark and Juergen Schoenwaelder for the detailed feedback and suggestions. Thanks to Andrew McGregor for suggestions on metric naming. Thanks to Michelle Cotton for her early IANA review, and to Amanda Baber for answering questions related to the presentation of the Registry and accessibility of the complete template via URL.
Thanks to Roni Even for his review and suggestions to generalize the procedures. Thanks to all of the Area Directors for their reviews.

Authors' Addresses

Marcelo Bagnulo
Universidad Carlos III de Madrid
Av. Universidad 30
28911 Leganes
Madrid
Spain
Phone: 34 91 6249500
Email: marcelo@it.uc3m.es
URI: http://www.it.uc3m.es

Benoit Claise
Huawei
Email: benoit.claise@huawei.com

Philip Eardley
BT
Adastral Park, Martlesham Heath
Ipswich
United Kingdom
Email: philip.eardley@bt.com

Al Morton
AT&T Labs
200 Laurel Avenue South
Middletown, NJ 07748
United States of America
Email: acmorton@att.com

Aamer Akhter
Consultant
118 Timber Hitch
Cary, NC
United States of America
Email: aakhter@gmail.com