While Adtran has a strong foundation in the world of fiber optics, it has also embraced software-defined networking (SDN), according to Robert Conger, CTO of Americas.
Adtran's Mosaic platform is a software-defined access solution that provides vendors and service providers alike with modular, component-based architectures that are open and programmable. Mosaic Cloud works with Adtran's software-defined networking controller as well as open source controllers such as ONOS and OpenDaylight.
RELATED: Q&A—Adtran's Robert Conger: The telecom industry needs a clear commercial plan
To further underscore Adtran's commitment to SDN, Adtran was one of four vendors that recently joined the Open Networking Foundation at the partner level by agreeing to pay $500,000 a year with a five-year commitment.
In this second Q&A with Conger, which was lightly edited for context and length, the executive talks about the evolution of SDN and the potential of machine learning and artificial intelligence.
FierceTelecom: SDN has been around for a while now. Where do you see it today and how will it evolve going forward?
Robert Conger: A lot of our focus has been more on SDN for access, SDN for business connectivity and SD-WAN type use cases. The biggest thing that's different between today and before is a lot of the open source groups that are really needed to kind of drive it have become a lot more prevalent and the customers are adopting it.
Before, SDN was kind of like a philosophy where everyone said "Yeah, that's great." But ultimately what operators really wanted was this true, open, multivendor environment that can support all the different devices and you can plug it into Linux and other stuff. Now everyone knows how to build devices that work in that environment.
Now you can build equipment that programs into that environment, and you can write applications that plug into it. Now that it's well defined, there's a much better path for it, and the operators have created a market for it. That's helpful. So before, it was more of an architectural vision, and not necessarily an environment; it wasn't really well defined how it would actually get rolled out. Now that's much more defined than it was in the past.
FierceTelecom: In the early "classic" SDN days we heard a lot about what it would do for service providers' capex and opex. Now it's more about enabling automation?
Conger: I think that the opex and capex things are a little bit more clear now. Everyone knows what the automation savings will eventually get them. Automation will be one of these things that's almost inherent. You have to have it. The whole customer experience has to be this user-driven model where they go to an app and they select services, and then the activation is done. I think everybody now understands that that's kind of table stakes. So SDN is not even really an optional thing anymore. This is going to be the way it's done going forward. Especially when you're trying to unify mobile and fixed environments and things like that. You're going to have to have a lot of that programmability, and SDN will help you get there quicker.
The other aspect of capex savings that is starting to become a little more clear is that now you can disaggregate and merge some of these functions. We used to have different development silos, like access, transport, subscriber edge devices and router groups. Now those kinds of things are consolidated and there's a lot more capex savings to be had.
FierceTelecom: What are your thoughts on how the industry will use machine learning and artificial intelligence?
Conger: Well, yes, definitely there are a lot of areas where machine learning and AI type applications can help. People are doing a lot of work to get more telemetry data out of their devices. A lot of the initiatives around programming next-gen SDN come down to "How do I get streaming telemetry on all these devices?" So I'm getting all the packet information and packet counters in real time, and not just alarms. It used to always be that you'd get alarm events and things like that. Now it's all about real-time data. And the reason for the real-time data is that you can put it into some big data lake and then run algorithms on it.
You try to put automated streaming telemetry on every device to get as much information as possible off of your network, put it into a big data lake, and then apply different algorithms to proactively identify faults or correlate events, especially in things like radio networks.
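The pipeline Conger describes — collect streaming counters, pool them, then run an algorithm to flag faults proactively — can be sketched in miniature. This is a hedged illustration, not Adtran's implementation: the device name, the `rx_packets` field, and the z-score threshold are all assumptions chosen for the example.

```python
from statistics import mean, stdev

# Hypothetical telemetry samples: per-device packet counters streamed in
# real time into a "data lake" (here, just a list). Field names are
# illustrative, not any vendor's actual schema.
samples = [
    {"device": "olt-1", "rx_packets": 1000},
    {"device": "olt-1", "rx_packets": 1020},
    {"device": "olt-1", "rx_packets": 990},
    {"device": "olt-1", "rx_packets": 1010},
    {"device": "olt-1", "rx_packets": 4800},  # sudden spike: possible fault
]

def flag_anomalies(samples, threshold=1.5):
    """Flag counters more than `threshold` standard deviations from the mean.

    A simple proactive check: instead of waiting for an alarm event, the
    algorithm scans raw counter data for statistical outliers.
    """
    values = [s["rx_packets"] for s in samples]
    mu, sigma = mean(values), stdev(values)
    return [s for s in samples
            if sigma and abs(s["rx_packets"] - mu) / sigma > threshold]
```

With the sample data above, only the 4800-packet spike is flagged. Real deployments would of course use far richer metrics and models; the point is the flow from raw telemetry to algorithmic fault detection.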
We've got really complex environments, like core copper access nodes, where you have to look at three or four different factors and correlate those together to really do root cause analysis. That's where things like AI and machine learning could really go way above and beyond what you can do with normal scripting. I mean, yes, you can brute force your way through some of those correlations if you know exactly what to look for. If you know what the event correlation is, then you can write scripts to do it. But as you get into more complex things where you don't really understand all the correlations or how they correlate, then that's where AI comes in and it can help you draw the correlation.
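The "brute force" scripted approach Conger contrasts with machine learning can be made concrete: if you already know which combination of events signals a root cause, a script can check for that known signature. The event types, timestamps, and 30-second window below are hypothetical, chosen only to illustrate the limitation — this only works for correlations you already understand.

```python
from datetime import datetime, timedelta

# Hypothetical alarm events from a copper access node (names illustrative).
events = [
    {"time": datetime(2018, 1, 1, 10, 0, 0), "type": "crc_errors"},
    {"time": datetime(2018, 1, 1, 10, 0, 5), "type": "line_retrain"},
    {"time": datetime(2018, 1, 1, 10, 0, 8), "type": "sync_loss"},
]

def correlate(events, pattern, window=timedelta(seconds=30)):
    """Return True if every event type in `pattern` occurs within one window.

    This is the scripted, known-signature approach: it can only confirm a
    correlation someone has already identified. Correlations nobody has
    spelled out are where ML models would have to take over.
    """
    times = {e["type"]: e["time"] for e in events if e["type"] in pattern}
    if set(pattern) != set(times):
        return False  # at least one expected factor never occurred
    return max(times.values()) - min(times.values()) <= window

# A known signature: these three together might suggest a failing copper pair.
correlate(events, ["crc_errors", "line_retrain", "sync_loss"])
```

For the sample events, the three-factor signature matches (all within 8 seconds), while any pattern including an event type that never fired does not.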