Why Internet Service Providers are Exploring Open Source

Broadband, industry trends, Telco

When you set up a network, the fiber optic cable, media distribution enclosures and keystones you use don’t lock your customer into using only compatible equipment. That can’t be said for the network infrastructure beyond your customers’ homes, but that may change soon.

The majority of Internet Service Providers (ISPs) use proprietary software that requires compatible hardware to manage their networks, but open source software is gaining traction in this space and has the potential to change how telcos operate. Proprietary software is built on code that is owned by a developer and licensed to users for a fee. Open source software is not owned by any one developer and is freely available to anyone who wants to use it.

Why open source?

Until recently, the common belief was that proprietary software offered a better user experience and stronger security, but this thinking is shifting for a number of reasons. Cost and agility are two major factors: open source solutions are faster and less expensive to deploy, and software that is not locked in with a particular vendor can keep pace with technological change. Code that anyone can read and modify becomes a foundation on which everyone can build new technologies.

Open source code is built by groups of developers who collaborate online using a system such as GitHub to update the code. Because these developers collaborate in a virtual environment, projects are less beholden to any single company's politics or goals. The developers working on open source range from students to paid professionals, experts in their fields who are passionate about what they do. The goal is always to improve the code.

Security used to be one of the main arguments against using open source, but this is no longer the case. With a much larger pool of developers looking for and fixing vulnerabilities, well-maintained open source code can be as secure as, or more secure than, proprietary software whose security rests with a limited number of developers.

ISPs and other telcos are looking to open source software for the development of Software-Defined Networking (SDN) and Network Functions Virtualization (NFV), architectures that can deliver faster, more reliable network services to their customers.
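The core idea behind SDN is separating the control plane (where forwarding decisions are made) from the data plane (where packets are actually forwarded), so that a central controller can reconfigure the whole network in software. A minimal sketch of that split is below; all class and field names here are illustrative, not drawn from OpenDaylight or ONOS:

```python
# Minimal sketch of the SDN idea: a central controller computes
# forwarding rules and pushes them to switches, which only do
# table lookups. All names here are illustrative, not a real API.

class Switch:
    """Data plane: holds a flow table and forwards by lookup."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination address -> output port

    def install_rule(self, dst, out_port):
        self.flow_table[dst] = out_port

    def forward(self, dst):
        # In a real SDN, an unmatched packet is punted to the
        # controller; here we just report the table miss.
        return self.flow_table.get(dst, "miss: ask controller")

class Controller:
    """Control plane: has a global view and pushes rules."""
    def __init__(self):
        self.switches = []

    def connect(self, switch):
        self.switches.append(switch)

    def push_policy(self, dst, out_port):
        # A new policy rolls out network-wide in software,
        # with no per-device manual configuration.
        for sw in self.switches:
            sw.install_rule(dst, out_port)

controller = Controller()
s1, s2 = Switch("s1"), Switch("s2")
controller.connect(s1)
controller.connect(s2)

controller.push_policy("10.0.0.5", out_port=3)
print(s1.forward("10.0.0.5"))  # -> 3
print(s2.forward("10.0.0.9"))  # -> miss: ask controller
```

The point of the design is the last few lines: one call at the controller updates every switch, which is what lets operators deliver new services quickly instead of reconfiguring hardware box by box.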

Origins of open source for telecommunications

Linux powers about 95% of the world's supercomputers and boasts a developer community in the thousands. It began in 1991 when Linus Torvalds, a computer science student, released a free operating system kernel called Linux. The Linux Foundation, established to support that ecosystem, has since been involved in countless open source projects.

In 2013 the Linux Foundation hosted the OpenDaylight Project, a collaborative framework for developing SDNs and NFV. The project was supported by tech giants like Ericsson, IBM, Cisco and Hewlett Packard. Large companies support these projects with funds and by assigning their developers to work on the code. By 2016 half of the OpenDaylight projects were being proposed by user companies like AT&T and Comcast.

AT&T and Telus testing open source

AT&T has 100 developers working on OpenDaylight and has invested heavily in Open Networking Lab (ON.Lab), a non-profit that released the Open Network Operating System (ONOS).

AT&T recently announced plans to launch a trial 10-gigabit symmetric passive optical network with open source SDN capabilities. The technology AT&T is using for this trial combines OpenDaylight and ONOS code.

OpenDaylight and ONOS both use a modular architecture to build more flexible, responsive networks, but each has its own specialty: OpenDaylight is designed primarily for data centers and focuses on merging established networks with SDN capabilities, while ONOS is designed for ISPs.

Telus has teamed up with Fujitsu and the Center of Excellence in Next Generation Networks (CENGN) to test Fujitsu’s Virtuora Network Controller, which uses open source code to provide SDN and NFV functionality.

The transition to open source is great for you and your customers because it means the networks of the future will be built on code designed to adapt to new technological requirements. And with the rate technology is changing, that can only be a good thing.