About 8,000 segments of fiber cable – that is the volume of the automatic data migration our team is currently performing for one of the largest telecommunications operators in the United States. We are mapping data from the legacy network inventory solution to the GE Smallworld Physical Network Inventory (PNI) system, implemented by GE in cooperation with Globema.
Smallworld PNI with Globema Physical Route Manager
It all began when the end customer decided to replace the legacy network inventory system and implement Physical Network Inventory (PNI). The Globema Physical Route Manager (PRM) add-on is also part of the deployed solution – PNI was enriched with the PRM module to provide a convenient way of describing connections between locations.
Data at the single fiber level
The PNI implementation also covers data migration from the previously used solution. The legacy system contains two types of information – on long-distance networks and on metropolitan networks. Both types include data about fiber optic cables, termination points and patch panels.
We have to map the cable route
The legacy network inventory system contained only information about the beginning and end of each cable. Information about the cable route was stored in another solution, in Shapefile (SHP) format. As part of the data migration, our team must associate all this information to map the actual cable routes.
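To illustrate the idea, here is a minimal sketch of how cable endpoints from the legacy inventory could be matched against route polylines extracted from the Shapefile data. All names, structures and the tolerance value are assumptions for illustration, not the actual loader code.

```python
import math

TOLERANCE = 1.0  # assumed matching tolerance in map units

def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def match_route(cable, routes, tol=TOLERANCE):
    """Return the first route polyline whose endpoints coincide
    (within tolerance) with the cable's begin/end points, in
    either direction; None means no geometry was found."""
    start, end = cable["start"], cable["end"]
    for route in routes:
        r_start, r_end = route[0], route[-1]
        if (distance(start, r_start) <= tol and distance(end, r_end) <= tol) or \
           (distance(start, r_end) <= tol and distance(end, r_start) <= tol):
            return route
    return None  # flag the cable for manual review

# Example: one cable and two candidate routes
cable = {"id": "C-1", "start": (0.0, 0.0), "end": (10.0, 0.0)}
routes = [
    [(5.0, 5.0), (9.0, 9.0)],
    [(10.0, 0.0), (6.0, 2.0), (0.0, 0.0)],  # reversed direction, still a match
]
print(match_route(cable, routes))
```

Cables with no matching geometry can then be reported separately instead of silently loaded without a route.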
Automatic data migration of 8,000 segments of fiber cables
We’re migrating about 8,000 segments of fiber cable into PNI. The migration is fully automatic – our team is developing dedicated data loaders, and we’re also using FME for comprehensive geometric data transformations. During the project we’re also facing data inconsistencies, which are an additional challenge for the migration.
Did you know?
We loaded into Smallworld information about the telco network used by the New York Stock Exchange
We’ve completed over 200 data migration projects
What are the key success factors for data migration projects?
Proper estimation of migration volume. You need:
- complete list of migrated objects
- all data sources identified
- well defined target structure
- verified data completeness
Well-documented data sources
- what they contain, how they are used, what their hierarchy is.
Data uniformity verified (before project kick-off)
- data diversity evaluation, all special cases examined.
Data quality verified (before project kick-off)
- quality tests on source data
- any issues found? If so, would it be hard to correct or enhance data?
- treat the data as one whole – take into consideration data coherence and the mutual dependencies of the information.
Automatic data quality verification
- detection of data duplicates
- enriching and cleaning data on a regular basis.
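Duplicate detection can be as simple as grouping records by a normalized key. The sketch below assumes cables are identified by a name that may differ only in case or surrounding whitespace; the field names are illustrative, not taken from the actual source system.

```python
from collections import defaultdict

def find_duplicates(records):
    """Group records by a normalized cable name and report
    every group that contains more than one entry."""
    groups = defaultdict(list)
    for rec in records:
        key = rec["name"].strip().lower()
        groups[key].append(rec["id"])
    return {k: v for k, v in groups.items() if len(v) > 1}

records = [
    {"id": 1, "name": "FIBER-0042"},
    {"id": 2, "name": "fiber-0042 "},   # same cable entered twice
    {"id": 3, "name": "FIBER-0043"},
]
print(find_duplicates(records))  # {'fiber-0042': [1, 2]}
```

Running such checks regularly, rather than once before kick-off, keeps new duplicates from creeping in during the project.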
Perform test data migration procedure
- …and acceptance tests afterward.
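One simple form such an acceptance test can take is comparing object counts per type between the source extract and the target load. The counts and type names below are made-up examples, not figures from this project.

```python
def acceptance_report(source_counts, target_counts):
    """Return a list of (object_type, source_count, target_count)
    tuples for every object type whose counts differ."""
    mismatches = []
    for obj_type in sorted(set(source_counts) | set(target_counts)):
        src = source_counts.get(obj_type, 0)
        tgt = target_counts.get(obj_type, 0)
        if src != tgt:
            mismatches.append((obj_type, src, tgt))
    return mismatches

source_counts = {"cable": 8000, "patch_panel": 1200, "termination_point": 16000}
target_counts = {"cable": 8000, "patch_panel": 1198, "termination_point": 16000}
print(acceptance_report(source_counts, target_counts))
# [('patch_panel', 1200, 1198)]
```

An empty report means the test migration moved everything; any mismatch points to exactly which object type needs investigation before the production run.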