AT&T has completed trials of 10 Gbps XGS-PON leveraging Open Source Access Manager Hardware Abstraction (OSAM-HA) software in Atlanta and Dallas.
OSAM-HA, formerly known as the VOLTHA 1.0 (Virtual Optical Line Termination Hardware Abstraction) software-defined access specification, was contributed to the Open Networking Foundation (ONF) in October.
OSAM, which leverages the ONAP specification, is a vendor-agnostic operational suite for managing consumer and business broadband access network elements and capabilities. This is separate from vendor-specific Access Element Management Systems.
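The core idea of a vendor-agnostic suite like OSAM/VOLTHA can be sketched as a common OLT-management interface with per-vendor adapters. The class and method names below are illustrative assumptions, not VOLTHA's actual API:

```python
from abc import ABC, abstractmethod

class OltAdapter(ABC):
    """Hypothetical hardware-abstraction interface: the operational
    suite talks only to this, never to vendor-specific EMS APIs."""

    @abstractmethod
    def activate_ont(self, serial: str) -> bool: ...

    @abstractmethod
    def port_status(self, port: int) -> str: ...

class VendorAOlt(OltAdapter):
    def activate_ont(self, serial: str) -> bool:
        # Would translate to vendor A's management-protocol calls.
        return True
    def port_status(self, port: int) -> str:
        return "up"

class VendorBOlt(OltAdapter):
    def activate_ont(self, serial: str) -> bool:
        # Vendor B's translation layer lives here instead.
        return True
    def port_status(self, port: int) -> str:
        return "up"

def provision(olt: OltAdapter, serial: str) -> str:
    # The management logic is identical regardless of vendor.
    return "active" if olt.activate_ont(serial) else "failed"

print(provision(VendorAOlt(), "ALCL12345678"))  # active
print(provision(VendorBOlt(), "BRCM87654321"))  # active
```

The point of the pattern is that operational tooling is written once against the abstract interface, so swapping white box hardware from a different vendor only requires a new adapter.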
During the field trials, AT&T said the XGS-PON system tested multigigabit high-speed internet traffic. The service provider used a virtualized Broadband Network Gateway function to manage subscribers.
Eddy Barker, assistant VP of access architecture and design for AT&T, told FierceTelecom that given the different geographies, the company got different results from the two trials.
“We did trials in Atlanta and Dallas and each one varied quite a bit,” Barker said. “We have quite a large vendor community we’re using.”
This community includes a wide mix of hardware and white box vendor options built around silicon from Broadcom and open source software. However, AT&T would not reveal the identity of the vendors used in the trials.
“We used several of our vendors, but not all of them,” Barker said. “We only picked a few of them to do trials with and we don’t want to disadvantage one versus the other in naming names.”
In Atlanta, AT&T tested XGS-PON for consumer and business customers. As part of the trial, AT&T tested the ability to support multigigabit services to figure out how to prepare its network for the next bandwidth surge.
“The trials were about future proofing technology and getting capacity and higher speeds where we need it for consumers and businesses should someone want multigigabit services,” Barker said. “The XGS-PON network could also support infrastructure like small cells.”
One of the key challenges for AT&T was being able to interoperate with existing services like its U-verse IPTV service.
“Some of the things we struggled with as we have done this in the labs previously was doing IP multicast, for example, which was how our whole U-Verse IPTV structure worked,” Barker said. “Out in the field, we have a mix of U-verse and satellite TV and any provider needs to support multiple types of networks.”
ONAP drives openness, commonality
Instead of deploying islands of technology that have SDN control, AT&T wants to orchestrate the entire end-to-end network through ONAP.
This is AT&T’s first virtual access project within The Linux Foundation, and it will use the first iteration of OSAM-HA technology.
ONAP struck a major coup this week when fellow telco Verizon joined the consortium as a member via The Linux Foundation.
“ONAP is all about preventing fragmentation for vendors and service providers,” Barker said. “We get the value, and we’re all going to accomplish our objectives of operating and building networks more efficiently.”
Barker said that if more service providers join ONAP, it will drive more industry-wide collaboration.
“The more big providers we can get on board, the better off we all are,” Barker said. “They are going to come and have their own ideas and contributions, and we’ll put that together with all the members of the organization and make us all better.”
A key focus of AT&T’s next-gen PON build out and trials is to simplify network operations.
In 2018, AT&T will focus its attention on its last-mile PON architecture to ensure its operations team can be more efficient.
The service provider expects the architecture for XGS-PON to be operational by 2019.
“In 2018, the focus is going to be on an open source project we have with ONF, ONAP and The Linux Foundation to make sure we capture the operational tooling that our operations teams need,” Barker said. “When we roll this out, we want to not only capture the value adds with lower cost development and optimization, but also capitalize on being able to take the virtual network functions and run them on our edge cloud infrastructure.”
At the same time, the service provider wants the operations team to be able to take advantage of the new capabilities immediately.
“We want to get operations comfortable with what we’re deploying so they don’t feel taking a step forward means they have to take a step back temporarily,” Barker said.
To support the operations teams, AT&T’s OSAM-HA will provide a suite of operational tools that live in ONAP and interface with the ONF software stacks.
This structure will give the operations teams all the elements they would traditionally have gotten from a proprietary vendor element management system (EMS) and apply them across multiple access technologies and vendors’ equipment.
“This gives them all of the interfaces they are accustomed to, whether it’s automated network mapping or troubleshooting tools – all the things they need to troubleshoot a network that will now be distributed,” Barker said. “Now the network goes end to end with some network elements that are traditional and others that are VNFs.”
Barker added that “since the network paths are more complicated it will be hard to troubleshoot if you don’t have the right tools to get to the bottom of those things quickly.”
Driving PON coexistence
For the XGS-PON technology to work within its widespread existing GPON networks, AT&T also worked on the coexistence element.
In these trials, AT&T illustrated how GPON and XGS-PON wavelengths could both exist across a single fiber interface.
Barker said that the success of the coexistence between GPON and XGS-PON came as a surprise.
“The coexistence was something that we did not initially plan in that trial,” Barker said. “We put it in and it worked great.”
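Coexistence on a single fiber works because GPON and XGS-PON are specified to use disjoint wavelength bands. As a rough sketch, assuming the nominal band edges from the ITU-T wavelength plans (GPON per G.984, XGS-PON per G.9807.1), a quick check confirms no pair of bands overlaps:

```python
# Nominal wavelength bands in nm, taken from the ITU-T plans
# (GPON: G.984 series; XGS-PON: G.9807.1). Cited here for illustration.
BANDS = {
    "GPON upstream":      (1290, 1330),
    "GPON downstream":    (1480, 1500),
    "XGS-PON upstream":   (1260, 1280),
    "XGS-PON downstream": (1575, 1580),
}

def overlaps(a, b):
    """True if two (low, high) wavelength bands overlap."""
    return a[0] < b[1] and b[0] < a[1]

# Compare every pair of bands: all should be clear of each other.
names = list(BANDS)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        clash = overlaps(BANDS[x], BANDS[y])
        print(f"{x} vs {y}: {'OVERLAP' if clash else 'clear'}")
```

Because the four bands are disjoint, a passive coexistence filter can split or combine GPON and XGS-PON traffic on the same fiber, which is what AT&T demonstrated in the trial.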
While the IEEE and the ITU’s Full-Service Access Network (FSAN) group have aligned on common PON wavelength plans, AT&T is hopeful the industry will converge on fully common standards.
“The IEEE and ITU have been able to standardize on the wavelength structures, which is really critical for the industry because it gives us the ability to deploy fiber,” Barker said. “As we go forward, we expect to improve on that by maybe instead of having separate standards for fiber to the premises, we can bring that together as we go the next generations to these things.”
Having commonality across standards will help AT&T and other service providers deploying fiber offset buildout costs.
“If you look at the cost of labor, which is the most expensive part of all this fiber business, having common standards is going to be critical as we densify small cells,” Barker said. “We’re trying to make sure that the optics are as inexpensive as possible to be able to mass deploy them because it’s going to be hard to drive down labor costs other than using more inexpensive installation techniques like microtrenching.”
This article was updated on January 17 with additional information from AT&T.