Network Working Group                                    Scott Poretsky
INTERNET-DRAFT                                       Reef Point Systems
Expires in: December 2006                                 Vijay Gurbani
                                                    Lucent Technologies
                                                            Carol Davids
                                       Illinois Institute of Technology
                                                               June 2006

          Terminology for Benchmarking SIP Networking Devices

Intellectual Property Rights (IPR) statement:

   By submitting this Internet-Draft, each author represents that any
   applicable patent or other IPR claims of which he or she is aware
   have been or will be disclosed, and any of which he or she becomes
   aware will be disclosed, in accordance with Section 6 of BCP 79.

Status of this Memo

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF), its areas, and its working groups.  Note that
   other groups may also distribute working documents as
   Internet-Drafts.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other
   documents at any time.  It is inappropriate to use Internet-Drafts
   as reference material or to cite them other than as "work in
   progress."

   The list of current Internet-Drafts can be accessed at
   http://www.ietf.org/ietf/1id-abstracts.txt.

   The list of Internet-Draft Shadow Directories can be accessed at
   http://www.ietf.org/shadow.html.

Copyright Notice

   Copyright (C) The Internet Society (2006).

ABSTRACT

   This document provides terminology for benchmarking the SIP
   performance of networking devices.  Terms are included for test
   components, test setup parameters, and performance benchmark
   metrics for black-box benchmarking of SIP networking devices.  The
   performance benchmark metrics are obtained for the SIP Control
   Plane and Media Plane.  The terms are intended for use in a
   companion methodology document that enables complete performance
   characterization of a device under a variety of network conditions,
   making it possible to compare the performance of different devices.
   It is critical to provide test setup parameters and a methodology
   document for SIP performance benchmarking because SIP allows a wide
   range of configuration and operational conditions that can
   influence performance benchmark measurements.  Terminology and
   methodology standards are necessary to ensure that reported
   benchmarks have consistent definitions and were obtained following
   the same procedures.  The benchmarks can be applied to a variety of
   SIP networking devices, including SIP Servers, Session Border
   Controllers (SBCs), and SIP-Aware Stateful Firewalls (SASFs),
   tested as a Device Under Test (DUT) or in combination as a System
   Under Test (SUT).
Table of Contents

   1. Introduction
   2. Existing Definitions
   3. Term Definitions
      3.1 Test Components
          3.1.1 SIP Control Plane
          3.1.2 SIP Media Plane
          3.1.3 Emulated Agents
          3.1.4 Session Server
          3.1.5 SIP-Aware Stateful Firewall
          3.1.6 Control Session
          3.1.7 Active Control Session
          3.1.8 Completed Control Session
          3.1.9 Session Attempt
          3.1.10 Session Setup
          3.1.11 Session Teardown
      3.2 Test Setup Parameters
          3.2.1 SIP Transport Protocol
          3.2.2 Intended Session Duration
          3.2.3 Measured Session Duration
          3.2.4 Intended Session Attempt Rate
          3.2.5 Intended Session Rate
          3.2.6 Media Streams per Session
          3.2.7 Media Packet Size
          3.2.8 Media Offered Load, per Media Stream
          3.2.9 Media Offered Load, Aggregate
          3.2.10 Session Hold Time
      3.3 Benchmarks
          3.3.1 Registration Rate
          3.3.2 Successful Session Rate
          3.3.3 Successful Session Attempt Rate
          3.3.4 Standing Session Capacity
          3.3.5 Session Completion Rate
          3.3.6 Busy Hour Session Connects (BHSC)
          3.3.7 Busy Hour Session Attempts (BHSA)
          3.3.8 Session Setup Delay
          3.3.9 Session Teardown Delay
          3.3.10 Standing Sessions
          3.3.11 IM Rate
          3.3.12 Presence Rate
   4. IANA Considerations
   5. Security Considerations
   6. Acknowledgements
   7. References
   8. Authors' Addresses
   9. Full Copyright Statement

1. Introduction

   Service Providers are now planning VoIP and multimedia network
   deployments using the IETF-developed Session Initiation Protocol
   (SIP).  VoIP has led to the development of new networking devices,
   including SIP Servers, Session Border Controllers, and SIP-Aware
   Stateful Firewalls.  The mix of voice and IP functions in this
   variety of devices has produced inconsistencies in vendor-reported
   performance metrics and has caused confusion in the service
   provider community.  SIP allows a wide range of configuration and
   operational conditions that can influence performance benchmark
   measurements.  It is also important to be able to correlate
   signaling measurements with media plane measurements to determine
   overall system performance.

   When defining SIP performance benchmarks, it is critical to also
   provide definitions for test setup parameters and a corresponding
   methodology document for SIP performance benchmarking.  This
   enables benchmarks to be understood, fairly compared, and
   reproduced.

   This document provides the terms for performance benchmarking of
   the SIP control and media planes.  Terms are included for Test
   Components, Test Setup Parameters, and Benchmarks.  All benchmarks
   are black-box measurements of the SIP Control and Media Planes.  It
   is intended that these terms be used in a companion methodology
   document.  The benchmarks can be used to compare a variety of SIP
   networking devices, including SIP Servers, Session Border
   Controllers (SBCs), and SIP-Aware Stateful Firewalls (SASFs),
   tested as a Device Under Test (DUT) or in combination as a System
   Under Test (SUT).

2. Existing Definitions

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in
   this document are to be interpreted as described in BCP 14, RFC
   2119.  RFC 2119 defines the use of these key words to help make the
   intent of standards track documents as clear as possible.  While
   this document uses these key words, it is not a standards track
   document.

   The term Throughput is defined in RFC 2544.
3. Term Definitions

3.1 Test Components

3.1.1 SIP Control Plane

Definition:
   SIP signaling used to establish and terminate a SIP session.

Discussion:
   It is critical to benchmark the performance of a device's SIP
   Control Plane.  SIP sessions can be established directly between
   two SIP User Agents, via a server, or via a series of servers.

Measurement Units:
   N/A

Issues:
   None

See Also:
   SIP Media Plane
   Emulated Agents

3.1.2 SIP Media Plane

Definition:
   Media transfer that occurs directly between two SIP end stations
   after a SIP session is established.

Discussion:
   The Media Plane is analogous to the Data Plane.  Packets for the
   SIP Control Plane and the SIP Media Plane traverse different paths,
   which can produce variation in performance.  For this reason it is
   necessary to benchmark the performance of both the SIP Control
   Plane and the SIP Media Plane.

Measurement Units:
   N/A

Issues:
   None

See Also:
   SIP Control Plane
   Emulated Agents

3.1.3 Emulated Agents

Definition:
   A device in the test topology that initiates or responds to SIP
   signaling as a session endpoint and that sources or receives the
   associated media for established connections.

Discussion:
   None

Measurement Units:
   N/A

Issues:
   None

See Also:
   SIP Media Plane
   SIP Control Plane

3.1.4 Session Server

Definition:
   A device in the test topology that acts as a proxy between
   Emulated Agents.  This device is either a DUT or a component of a
   SUT.

Discussion:
   The Session Server MAY be a Proxy Server or a Session Border
   Controller (SBC).

Measurement Units:
   N/A

Issues:
   None

See Also:
   SIP Control Plane

3.1.5 SIP-Aware Stateful Firewall

Definition:
   A device in the test topology that provides SIP Denial-of-Service
   (DoS) protection for the Emulated Agents and the Session Server.

Discussion:
   SIP-Aware Stateful Firewalls MAY include additional functionality,
   but that is beyond the scope of this work item.  When testing a
   SIP-Aware Stateful Firewall, it MUST be tested together with the
   Session Server as a SUT.  Alternatively, the Session Server may
   have an integrated SIP-Aware Stateful Firewall, in which case it is
   a DUT.

Measurement Units:
   N/A

Issues:
   None

See Also:
   None

3.1.6 Control Session

Definition:
   SIP signaling exchanged between the Emulated Agent and the DUT/SUT
   to set up, maintain, and tear down a session.

Discussion:
   A Control Session MAY have media.  Application of media is test
   case dependent.

Measurement Units:
   N/A

Issues:
   None

See Also:
   Active Control Session
   Completed Control Session

3.1.7 Active Control Session

Definition:
   An established SIP control session at the DUT/SUT.

Discussion:
   An Active Control Session MAY have media.  Application of media is
   test case dependent.

Measurement Units:
   N/A

Issues:
   None

See Also:
   Control Session

3.1.8 Completed Control Session

Definition:
   A SIP control session for which both setup and teardown, through
   the BYE, have been exchanged between the Emulated Agent and the
   DUT/SUT.

Discussion:
   A Completed Control Session MAY have had media.  Application of
   media is test case dependent.

Measurement Units:
   N/A

Issues:
   None

See Also:
   Control Session

3.1.9 Session Attempt

Definition:
   An attempt by the Emulated Agent to establish a session on the
   DUT/SUT.

Discussion:
   None

Measurement Units:
   N/A

Issues:
   None

See Also:
   None

3.1.10 Session Setup

Definition:
   A successful session establishment between the Emulated Agent and
   the DUT/SUT.

Discussion:
   None

Measurement Units:
   N/A

Issues:
   None

See Also:
   None

3.1.11 Session Teardown

Definition:
   A successful session disconnect between the Emulated Agent and the
   DUT/SUT.

Discussion:
   None

Measurement Units:
   N/A

Issues:
   None

See Also:
   None

3.2 Test Setup Parameters

3.2.1 SIP Transport Protocol

Definition:
   The protocol used for transport of the SIP Control Plane.

Discussion:
   Performance benchmarks may vary for the same SIP networking device
   when TCP, UDP, TLS, SCTP, or another transport protocol is used.
   For this reason it is necessary to measure the SIP performance
   benchmarks using the various transport protocols.  Performance
   benchmarks MUST report the SIP Transport Protocol used to obtain
   the results.

Measurement Units:
   TCP, UDP, TLS, or SCTP

Issues:
   None

See Also:
   None

3.2.2 Intended Session Duration

Definition:
   Configuration on the Emulated Agent for the time from session
   establishment to the BYE.

Discussion:
   The Intended Session Duration is configured on the Emulated Agent.
   This value is used for all sessions.  When benchmarking at the
   Intended Session Attempt Rate instead of the Intended Session Rate,
   the effective value of the Intended Session Duration is infinite.

Measurement Units:
   seconds

Issues:
   None

See Also:
   Intended Session Rate
   Intended Session Attempt Rate

3.2.3 Measured Session Duration

Definition:
   The average time, measured at the DUT/SUT, from session
   establishment to the BYE.

Discussion:
   The Measured Session Duration MAY differ from the Intended Session
   Duration.  This parameter requires that the session duration be
   measured for every session throughout the test duration.

Measurement Units:
   seconds

Issues:
   None

See Also:
   Intended Session Rate
   Intended Session Attempt Rate

3.2.4 Intended Session Attempt Rate

Definition:
   Configuration on the Emulated Agent for the number of sessions to
   establish per continuous one-second interval, with the sessions not
   terminated (they remain established for the test duration).

Discussion:
   The Intended Session Attempt Rate can cause variation in
   performance benchmark measurements.

Measurement Units:
   session attempts per second (saps)

Issues:
   None

See Also:
   Measured Session Duration
   Intended Session Rate

3.2.5 Intended Session Rate

Definition:
   Configuration on the Emulated Agent for the number of sessions to
   establish per continuous one-second interval, with the sessions
   terminated at the Measured Session Duration.

Discussion:
   For equal values, the Intended Session Rate is more stressful on
   the DUT/SUT than the Intended Session Attempt Rate, since the
   DUT/SUT must process session setups and teardowns concurrently
   during each one-second interval.  An illustrative sketch follows
   this section.

Measurement Units:
   sessions per second (sps)

Issues:
   None

See Also:
   Measured Session Duration
   Intended Session Attempt Rate
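   The interaction between session rate and session duration can be
   made concrete with a short sketch.  The following Python fragment
   is illustrative only and is not part of this terminology; the
   function name and the steady-state approximation it encodes
   (concurrently active sessions roughly equal rate times duration)
   are assumptions of the sketch, not definitions from this document.

      def steady_state_load(session_rate_sps: float,
                            session_duration_sec: float) -> dict:
          """Approximate steady-state load implied by an Intended
          Session Rate and a fixed session duration."""
          return {
              # Each second, this many sessions are set up ...
              "setups_per_sec": session_rate_sps,
              # ... and, once the duration has elapsed, the same
              # number are torn down concurrently (Section 3.2.5).
              "teardowns_per_sec": session_rate_sps,
              # Active Control Sessions accumulate to roughly
              # rate x duration (cf. Standing Sessions, 3.3.10).
              "standing_sessions":
                  session_rate_sps * session_duration_sec,
          }

      # Example: 100 sps with a 30-second duration implies roughly
      # 3000 concurrently active sessions in steady state.
      print(steady_state_load(100.0, 30.0))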
3.2.6 Media Streams per Session

Definition:
   The fixed number of media streams offered for each session.

Discussion:
   For a single benchmark test, all sessions use the same number of
   Media Streams per Session.  The presence of media streams and the
   number of media streams per session can cause variation in
   performance benchmark measurements.  The RECOMMENDED values for
   Media Streams per Session are 0, 1, 2, 3, and 4, but higher values
   can be used.

Measurement Units:
   media streams per session (msps)

Issues:
   At this time, media benchmarking needs to be defined.  A Session
   Server may participate in the control session but not the media
   flow.  It is possible that a SUT is required to benchmark media.
   If a SUT, and not a DUT, is required, then what are the specific
   components of the SUT: server, relay, gateway, firewall?  This may
   be addressed in the methodology by creating different test cases
   for each possible scenario.

3.2.7 Media Packet Size

Definition:
   The fixed size of the packets used for media.

Discussion:
   None

Measurement Units:
   bytes

Issues:
   None

See Also:
   None

3.2.8 Media Offered Load, per Media Stream

Definition:
   The constant amount of media traffic offered by the Emulated Agent
   to the DUT/SUT for each media stream.

Discussion:
   For a single benchmark test, all sessions use the same Media
   Offered Load, per Media Stream.

Measurement Units:
   packets per second (pps)

Issues:
   None

See Also:
   None

3.2.9 Media Offered Load, Aggregate

Definition:
   The total amount of media traffic offered by the Emulated Agent to
   the DUT/SUT.

Discussion:
   An illustrative calculation follows this section.

Measurement Units:
   packets per second (pps)

Issues:
   None

See Also:
   Media Offered Load, per Media Stream
   Media Streams per Session
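   The aggregate load follows from parameters already defined: the
   per-stream rate, the Media Streams per Session, and the number of
   concurrently active sessions.  A minimal sketch, assuming this
   multiplicative relationship (which this document implies but does
   not state as a formula); the function name and the example codec
   figures are illustrative only:

      def aggregate_media_load(per_stream_pps: float,
                               streams_per_session: int,
                               active_sessions: int,
                               packet_size_bytes: int):
          """Aggregate offered load (pps) and equivalent bit rate."""
          agg_pps = (per_stream_pps * streams_per_session
                     * active_sessions)
          agg_bps = agg_pps * packet_size_bytes * 8  # bits/second
          return agg_pps, agg_bps

      # Example: 50 pps per stream (e.g., 20 ms packetization), 2
      # media streams per session, 1000 active sessions, 200-byte
      # media packets.
      pps, bps = aggregate_media_load(50.0, 2, 1000, 200)
      print(pps, bps)  # 100000.0 pps, 160000000 bits/s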
3.2.10 Session Hold Time

Definition:
   The duration for which media flows from the Tester to the DUT for
   an Active Control Session.

Discussion:
   None

Measurement Units:
   seconds

Issues:
   None

See Also:
   Active Control Session

3.3 Benchmarks

3.3.1 Registration Rate

Definition:
   The maximum number of registrations successfully completed by the
   DUT/SUT per second.

Discussion:
   This benchmark is obtained with zero failures; that is, 100% of the
   registrations attempted by the Emulated Agent are successfully
   completed by the DUT/SUT.  The maximum value is obtained by testing
   to failure.

Measurement Units:
   registrations per second (rps)

Issues:
   None

See Also:
   None

3.3.2 Successful Session Rate

Definition:
   The maximum number of sessions successfully established per
   continuous one-second interval, with the sessions terminated at the
   Measured Session Duration.

Discussion:
   This benchmark is obtained with zero failures; that is, 100% of the
   sessions attempted by the Emulated Agent successfully establish.
   The maximum value is obtained by testing to failure.  An
   illustrative sketch of testing to failure follows this section.

Measurement Units:
   sessions per second (sps)

Issues:
   None

See Also:
   Active Control Session
   Intended Session Rate
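   The phrase "testing to failure" can be illustrated with a hedged
   sketch.  In the fragment below, run_trial() is a hypothetical
   stand-in for configuring the Emulated Agent at a given rate and
   checking that 100% of sessions succeed; neither it nor the search
   procedure is defined by this document or a companion methodology.

      def max_zero_failure_rate(run_trial, low: int,
                                high: int) -> int:
          """Binary-search the largest rate in [low, high] for which
          run_trial(rate) reports zero failures.

          run_trial(rate) must return True only when every session
          attempted at `rate` sessions/second succeeds."""
          best = 0
          while low <= high:
              mid = (low + high) // 2
              if run_trial(mid):    # zero failures at this rate
                  best = mid
                  low = mid + 1     # probe a higher rate
              else:
                  high = mid - 1    # back off toward success
          return best

   A linear ramp-up would serve equally well; the point is only that
   the reported benchmark is the highest rate at which no failures
   were observed.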
3.3.3 Successful Session Attempt Rate

Definition:
   The maximum number of sessions successfully established per
   continuous one-second interval, with the sessions not terminated
   (they remain established for the test duration).

Discussion:
   None

Measurement Units:
   session attempts per second (saps)

Issues:
   None

See Also:
   Intended Session Attempt Rate
   Successful Session Rate

3.3.4 Standing Session Capacity

Definition:
   The maximum number of SIP sessions that the DUT/SUT can have
   simultaneously established.

Discussion:
   The Standing Session Capacity must be reported together with the
   Successful Session Attempt Rate used to reach the maximum.

Measurement Units:
   sessions

Issues:
   None

See Also:
   Successful Session Attempt Rate

3.3.5 Session Completion Rate

Definition:
   The percentage of sessions that successfully establish and
   terminate over the duration of a benchmarking test.

Discussion:
   Session Completion Rate is a benchmark for session success.  The
   duration over which this benchmark is measured is to be specified
   in the methodology.  When the Session Completion Rate is reported,
   the Successful Session Rate, Measured Session Duration, Media
   Streams per Session, and Media Offered Load, per Media Stream MUST
   also be reported.  An illustrative calculation follows this
   section.

Measurement Units:
   percentage (%)

Issues:
   None

See Also:
   Successful Session Rate
   Measured Session Duration
   Media Streams per Session
   Media Offered Load, per Media Stream
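   A minimal sketch of the computation, assuming the tester counts
   attempted and completed sessions over the test duration; the
   function name is illustrative only:

      def session_completion_rate(completed: int,
                                  attempted: int) -> float:
          """Percentage of attempted sessions that both established
          and terminated successfully during the test."""
          if attempted == 0:
              raise ValueError("no session attempts recorded")
          return 100.0 * completed / attempted

      print(session_completion_rate(9985, 10000))  # 99.85 (%)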
3.3.6 Busy Hour Session Connects (BHSC)

Definition:
   The number of sessions that successfully establish and terminate
   over a one-hour period.

Discussion:
   When the BHSC is reported, the Successful Session Rate, Measured
   Session Duration, Media Streams per Session, and Media Offered
   Load, per Media Stream MUST also be reported.  This benchmark is
   useful for long-duration performance tests.

Measurement Units:
   sessions

Issues:
   None

See Also:
   Successful Session Rate
   Measured Session Duration
   Media Streams per Session
   Media Offered Load, per Media Stream

3.3.7 Busy Hour Session Attempts (BHSA)

Definition:
   The number of sessions that successfully establish over a one-hour
   period.

Discussion:
   Sessions are not terminated for the BHSA.  When the BHSA is
   reported, the Successful Session Rate, Measured Session Duration,
   Media Streams per Session, and Media Offered Load, per Media Stream
   MUST also be reported.  This benchmark is useful for long-duration
   performance tests.

Measurement Units:
   sessions

Issues:
   None

See Also:
   Successful Session Rate
   Measured Session Duration
   Media Streams per Session
   Media Offered Load, per Media Stream

3.3.8 Session Setup Delay

Definition:
   The average time for a session to establish.

Discussion:
   Time is measured at the Emulated Agent from the signaling of the
   first INVITE until the session is established.  Session Setup Delay
   MUST be measured for every established session in order to
   calculate the average.  Session Setup Delay MUST be measured at the
   Successful Session Attempt Rate.

Measurement Units:
   msec

Issues:
   None

See Also:
   Successful Session Attempt Rate

3.3.9 Session Teardown Delay

Definition:
   The average time for a session to tear down.

Discussion:
   Time is measured at the Emulated Agent from the signaling of the
   BYE until the session is terminated.  Session Teardown Delay MUST
   be measured for every established session in order to calculate the
   average.  Session Teardown Delay MUST be measured with the rate of
   teardowns configured to the value of the Successful Session Attempt
   Rate.

Measurement Units:
   msec

Issues:
   None

See Also:
   Successful Session Attempt Rate

3.3.10 Standing Sessions

Definition:
   The number of Active Control Sessions concurrently established on
   the DUT/SUT.

Discussion:
   The number of Standing Sessions is influenced by the Session
   Duration and the Session Rate (or Session Attempt Rate).
   Benchmarks MUST be reported with the maximum and average numbers of
   Standing Sessions on the DUT/SUT.  In order to determine the
   maximum and average Standing Sessions for the duration of the test,
   it is necessary to make periodic measurements of the number of
   Standing Sessions on the DUT/SUT.  The RECOMMENDED measurement
   period is 1 second.  An illustrative sketch follows this section.

Measurement Units:
   sessions

Issues:
   None

See Also:
   Active Control Session
   Session Duration
   Session Rate
   Session Attempt Rate
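   A hedged sketch of the periodic measurement described above,
   assuming a tester-side view of the Active Control Session count;
   count_active_sessions() is a hypothetical hook, and the simple
   sleep-based loop ignores clock drift:

      import time

      def sample_standing_sessions(count_active_sessions,
                                   test_duration_sec: int,
                                   period_sec: float = 1.0):
          """Sample the Standing Sessions count once per period and
          report (maximum, average) over the test duration."""
          samples = []
          for _ in range(int(test_duration_sec / period_sec)):
              samples.append(count_active_sessions())  # tester read
              time.sleep(period_sec)  # RECOMMENDED period: 1 second
          return max(samples), sum(samples) / len(samples)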
3.3.11 IM Rate

Definition:
   The maximum number of IM messages completed successfully by the
   DUT/SUT per second.

Discussion:
   For a UAS, the definition of success is the receipt of an IM
   request and the subsequent sending of a final response.  For a UAC,
   the definition of success is the sending of an IM request and the
   receipt of a final response to it.  For a proxy, the definition of
   success is as follows (see the sketch following this section):

   a) the number of IM requests it receives from the upstream client
      MUST be equal to the number of IM requests it sends to the
      downstream server; and

   b) the number of IM responses it receives from the downstream
      server MUST be equal to the number of IM requests it sent to the
      downstream server; and

   c) the number of IM responses it sends to the upstream client MUST
      be equal to the number of IM requests it received from the
      upstream client.

Measurement Units:
   IM messages per second

Issues:
   None

See Also:
   None
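   Conditions (a) through (c) above amount to a simple accounting
   check.  A minimal sketch, assuming the tester collects the four
   counters; the type and field names are illustrative only:

      from dataclasses import dataclass

      @dataclass
      class ProxyImCounters:
          reqs_from_upstream: int     # IM requests from the client
          reqs_to_downstream: int     # IM requests sent to server
          resps_from_downstream: int  # IM responses from the server
          resps_to_upstream: int      # IM responses sent to client

      def proxy_im_success(c: ProxyImCounters) -> bool:
          """True when the proxy accounting satisfies (a)-(c)."""
          return (c.reqs_from_upstream == c.reqs_to_downstream  # (a)
                  and c.resps_from_downstream
                      == c.reqs_to_downstream                   # (b)
                  and c.resps_to_upstream
                      == c.reqs_from_upstream)                  # (c)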
3.3.12 Presence Rate

Definition:
   The maximum number of presence notifications sent out per second by
   the DUT/SUT acting as a Presence Agent [Ro04].

Discussion:
   The intent of this benchmark is to assess the throughput of a
   Presence Agent (PA, see [Ro04]).  The PA accepts subscriptions from
   watchers, and when the target of a subscription registers with the
   PA (which is acting as a registrar), a notification is generated to
   the watcher.  This benchmark uses the presence event package
   documented in [Ro04].

Measurement Units:
   presence notifications sent per second

See Also:
   Registration Rate (Section 3.3.1).  The Presence Rate will be less
   than or equal to the Registration Rate.

4. IANA Considerations

   This document requires no IANA considerations.

5. Security Considerations

   Documents of this type do not directly affect the security of the
   Internet or of corporate networks as long as benchmarking is not
   performed on devices or systems connected to production networks.
   Security threats in SIP and the media layer, and how to counter
   them, are discussed in RFC 3261, RFC 3550, and RFC 3711, as well as
   in various other drafts.  This document attempts to formalize a set
   of common terminology for benchmarking SIP networks.

6. Acknowledgements

   The authors would like to thank Keith Drage and Daryl Malas for
   their contributions to this document.

7. References

7.1 Normative References

   [Ba91] Bradner, S., "Benchmarking Terminology for Network
          Interconnection Devices", IETF RFC 1242, July 1991.

   [Ba99] Bradner, S. and J. McQuaid, "Benchmarking Methodology for
          Network Interconnect Devices", IETF RFC 2544, March 1999.

   [Ma98] Mandeville, R., "Benchmarking Terminology for LAN Switching
          Devices", IETF RFC 2285, February 1998.

   [Ro02] Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston, A.,
          Peterson, J., Sparks, R., Handley, M., and E. Schooler,
          "SIP: Session Initiation Protocol", IETF RFC 3261, June
          2002.

   [Ro04] Rosenberg, J., "A Presence Event Package for the Session
          Initiation Protocol (SIP)", IETF RFC 3856, August 2004.

   [Ga05] Garcia-Martin, M., "Input 3rd-Generation Partnership Project
          (3GPP) Release 5 Requirements on the Session Initiation
          Protocol (SIP)", IETF RFC 4083, May 2005.

   [Sp06] Sparks, R., et al., "Session Initiation Protocol (SIP)
          Torture Test Messages", IETF RFC 4475, June 2006.

   [Ma06] Malas, D., "SIP Performance Metrics",
          draft-malas-performance-metrics-01.txt, work in progress,
          June 2006.

   [Li06] Lingle, K., Mule, J., Maeng, J., and D. Walker, "Management
          Information Base for the Session Initiation Protocol (SIP)",
          draft-ietf-sip-mib-10.txt, work in progress, March 2006.

7.2 Informative References

   None

8. Authors' Addresses

   Scott Poretsky
   Reef Point Systems
   8 New England Executive Park
   Burlington, MA 01803
   USA
   Phone: +1 508 439 9008
   EMail: sporetsky@reefpoint.com

   Vijay Gurbani
   Lucent Technologies
   2000 Lucent Lane, Room 6G-440
   Naperville, IL 60566
   USA
   Phone: +1 630 224 0216
   EMail: vkg@lucent.com

   Carol Davids
   Illinois Institute of Technology
   Rice Campus, 201 East Loop Road
   Wheaton, IL 60187
   USA
   Phone: +1 630 682 6000
   EMail: davids@iit.edu

9. Full Copyright Statement

   Copyright (C) The Internet Society (2006).

   This document is subject to the rights, licenses and restrictions
   contained in BCP 78, and except as set forth therein, the authors
   retain all their rights.

   This document and the information contained herein are provided on
   an "AS IS" basis and THE CONTRIBUTOR, THE ORGANIZATION HE/SHE
   REPRESENTS OR IS SPONSORED BY (IF ANY), THE INTERNET SOCIETY AND
   THE INTERNET ENGINEERING TASK FORCE DISCLAIM ALL WARRANTIES,
   EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTY THAT
   THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY RIGHTS OR
   ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A
   PARTICULAR PURPOSE.

Intellectual Property

   The IETF takes no position regarding the validity or scope of any
   Intellectual Property Rights or other rights that might be claimed
   to pertain to the implementation or use of the technology described
   in this document or the extent to which any license under such
   rights might or might not be available; nor does it represent that
   it has made any independent effort to identify any such rights.
   Information on the procedures with respect to rights in RFC
   documents can be found in BCP 78 and BCP 79.

   Copies of IPR disclosures made to the IETF Secretariat and any
   assurances of licenses to be made available, or the result of an
   attempt made to obtain a general license or permission for the use
   of such proprietary rights by implementers or users of this
   specification can be obtained from the IETF on-line IPR repository
   at http://www.ietf.org/ipr.

   The IETF invites any interested party to bring to its attention any
   copyrights, patents or patent applications, or other proprietary
   rights that may cover technology that may be required to implement
   this standard.  Please address the information to the IETF at
   ietf-ipr@ietf.org.

Acknowledgement

   Funding for the RFC Editor function is currently provided by the
   Internet Society.