A recent podcast on Telecoms.com explored the concept of “dirty data.” The term might be a bit of a misnomer, but in this particular case the data is referred to as “dirty” because it is inaccurate or misleading. Below we dig a little deeper into the key themes discussed in the podcast and their potential impact on managing a fiber network.
A telecommunications company operates several categories of systems to maintain its records. At a high level, these systems are defined by the following acronyms, though naturally their existence, functionality, and user groups vary from one organization to another.
- Operational support system (OSS): the tools, products, and systems that enable the network to operate
- Business support system (BSS): the systems that enable business operations, including sales, marketing, and fulfillment
- Physical network inventory (PNI): the record of what assets are out in the field and how they are being used
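To make the "how will they talk to each other" question concrete, here is a minimal, hypothetical sketch of how records in these systems might reference one another through shared identifiers. The record types and fields below are illustrative assumptions, not the schema of any real OSS, BSS, or PNI product; real systems are far richer.

```python
from dataclasses import dataclass

# Hypothetical, simplified records -- real OSS/BSS/PNI schemas are far richer.

@dataclass
class PniAsset:
    """Physical network inventory: an asset in the field."""
    asset_id: str          # shared key that other systems reference
    kind: str              # e.g. "pole", "conduit", "vault"
    lat: float
    lon: float

@dataclass
class OssPort:
    """Operational record: a port that lives on a physical asset."""
    port_id: str
    asset_id: str          # foreign key into the PNI
    status: str            # e.g. "in_service", "spare"

@dataclass
class BssOrder:
    """Business record: a customer order fulfilled on a port."""
    order_id: str
    port_id: str           # foreign key into the OSS

def orphaned_ports(ports, assets):
    """Ports whose physical asset is unknown -- one flavor of dirty data."""
    known = {a.asset_id for a in assets}
    return [p for p in ports if p.asset_id not in known]

assets = [PniAsset("A1", "vault", 40.0, -75.0)]
ports = [OssPort("P1", "A1", "in_service"), OssPort("P2", "A9", "spare")]
print([p.port_id for p in orphaned_ports(ports, assets)])  # ['P2']
```

When the systems disagree on these shared keys, as in the orphaned port above, every downstream answer built on them inherits the error.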
Assuming that an organization has purchased or built each of these systems, a number of questions arise: Which teams should have access to each system? Who will maintain the data in each of them? How will they talk to each other? These questions seem easy to answer until each system has been heavily customized to best support the unique products and technologies in use throughout the organization.
Fast forward ten years from implementation and it is time for that organization to upgrade to the newest, fastest technology, as market opportunities demand more bandwidth and lower latency. What does this require? More new systems, of course. The new systems are thrown into the mix with the old ones; the same questions and customizations arise; and the cycle repeats, increasing in complexity and becoming harder to manage each time.
The evolution of data collection
As technology has evolved, so have the processes and systems used at telecom companies and their partners. What started as a drafting-like process - think WWI Zeppelin intelligence - in which paper records were created, stored, and mailed, was transformed by the invention of computing. The process was quite naturally digitized, generating rapid demand for engineers to design networks. But that CAD data was still like its paper predecessor: flat.
The adoption of geographic information systems (GIS) across many industries and municipalities enabled the creation of geolocated maps and the unification of multiple data sources. Building on this geometric representation, logical data links the digital representations of physical objects in the same way the objects themselves are connected. Today, telcos may find themselves at any stage of this data evolution across multiple geographies and business units, leading to varying data quality across each organization as well.
The end goal of implementing these new technologies within an organization is to achieve FAANG-like functionality - incorporating automation, AI in operations, effective use of technology, and so on - throughout the network. However, to use automation effectively in many aspects of the business, it is paramount that the company knows precisely what assets it owns and how each is being utilized - in other words, that it has “clean data.”
In order to further understand the impact of the data, and more precisely, the importance of eliminating dirty data, we need to examine how the data is being used.
> Network design: The data required includes existing assets (poles, conduit, vaults, etc.) as well as customer locations. If this data is incorrect, the network design will in turn be incorrect, causing delays in the timeline and overruns in the budget.
> Sales: With the growing demand for fiber connections at businesses and homes alike, operators need to know which strands are connected, where they are connected, and what wavelengths are in use. The question of whether a customer can be connected to a specific fiber circuit can only be answered accurately if the sales team has access to correct network data. This also enables them to give potential clients more precise cost and timeline estimates.
> Network maintenance: When service goes out, operators scramble to determine the cause and location of the fault. To rectify an issue quickly, the location of the outage and its impact on the network as a whole need to be known.
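The sales and maintenance questions above both reduce to traversals over accurate connectivity data. As an illustration only - a toy model with invented site names, not any vendor's inventory API - the same graph of correct connections can answer "can this customer be reached?" and "who loses service if this splice fails?":

```python
from collections import deque

# Toy fiber network: nodes are hypothetical sites, edges are fiber segments.
network = {
    "central_office": ["splice_1"],
    "splice_1": ["cabinet_a", "cabinet_b"],
    "cabinet_a": ["customer_1", "customer_2"],
    "cabinet_b": ["customer_3"],
    "customer_1": [], "customer_2": [], "customer_3": [],
}

def reachable_from(graph, start, failed=frozenset()):
    """All nodes reachable from `start`, skipping failed nodes (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Sales question: can customer_3 be served from the central office?
print("customer_3" in reachable_from(network, "central_office"))  # True

# Maintenance question: which customers lose service if splice_1 fails?
healthy = reachable_from(network, "central_office", failed={"splice_1"})
print(sorted(c for c in ("customer_1", "customer_2", "customer_3")
             if c not in healthy))
```

The point is not the algorithm, which is a plain breadth-first search, but the input: if the recorded edges do not match the fibers actually spliced in the field, both answers are confidently wrong.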
All of these uses rely on quality data, and with the ongoing evolution and digital transformation of telecom, it is crucial for organizations to assess the quality and potential impact of their data. Because, as the podcast points out, if you don’t have accurate network data now, how can software configure a proper network that provides the right services to the right customers? It all goes back to the old saying: garbage in, garbage out. For truly successful autonomous networks, the machines have to be fed correct data.
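The "garbage in, garbage out" point suggests a practical first step: automated checks that flag dirty records before they are fed to downstream automation. The sketch below is a minimal illustration under assumed record fields (`asset_id`, `kind`, `lat`, `lon`); the checks and field names are hypothetical, not a specific product's validation rules.

```python
def find_dirty_records(records):
    """Flag records that would mislead downstream automation.

    Checks are illustrative: duplicate IDs, missing coordinates,
    and unknown asset types.
    """
    valid_kinds = {"pole", "conduit", "vault", "cabinet"}
    issues, seen_ids = [], set()
    for rec in records:
        rid = rec.get("asset_id")
        if rid in seen_ids:
            issues.append((rid, "duplicate asset_id"))
        seen_ids.add(rid)
        if rec.get("lat") is None or rec.get("lon") is None:
            issues.append((rid, "missing coordinates"))
        if rec.get("kind") not in valid_kinds:
            issues.append((rid, "unknown asset type"))
    return issues

records = [
    {"asset_id": "A1", "kind": "pole", "lat": 40.0, "lon": -75.0},
    {"asset_id": "A1", "kind": "vault", "lat": 40.1, "lon": -75.1},  # duplicate
    {"asset_id": "A2", "kind": "pole", "lat": None, "lon": -75.2},   # no lat
]
for asset_id, problem in find_dirty_records(records):
    print(asset_id, problem)
```

Checks like these do not make data clean by themselves, but they turn "we think our records are dirty" into a measurable, fixable backlog.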