The data-driven smart city: Tackling the challenges

Milton Keynes is the fastest-growing city in the UK, with its population set to grow from 255,700 in 2013 to over 300,000 by 2026.

This means that, like many cities, its infrastructure is under strain. Milton Keynes needs to find ways to support sustainable growth, while meeting key carbon reduction targets.

Central to this drive is the creation of a state-of-the-art MK Data Hub, as part of the government-funded MK:Smart initiative. And this platform is the focus of a TM Forum Catalyst project.

Catalysts are rapid-fire, member-driven proof-of-concept projects that bring ecosystem players together to find innovative solutions to common industry challenges. There are two main roles in a Catalyst – champions (usually an organization seeking a solution to a problem) and participants (organizations contributing to solving the problem). The City of Milton Keynes, BT, The Open University and NRECA are the champions here, working with BearingPoint/Infonova, Huawei and Cloudsoft.

The MK Data Hub

The MK Data Hub brings together the whole smart city ecosystem on one platform, including end users, service providers such as energy and water companies, government, sensor network providers, data providers, and developers.

All the sensor data, from traffic flow to waste bin/trash can information, is pulled into the Data Hub. Developers can then access and use it to create smart applications. Other parties, like government departments, can also pull useful reports from the data — to see how resources can be managed more efficiently, for example.

The first iteration of this Catalyst looked at monetization opportunities in smart city ecosystems — the Data Hub allows each party to define how they want to make money from their data.

In this latest round, named Service Level Management for Smart City Ecosystems and Trusted IoT, the team is testing commercial viability and trust – both of which will be essential to making a smart city data economy a reality. They have introduced:

  • Service Level Agreements

The new data economy means that applications and businesses may well rely on the data being provided through the Data Hub. The team is looking at how to ensure the data platform is stable and reliable. They are also demonstrating how to enable data providers to offer ‘classes’ of service, e.g. charging more for higher-speed data delivery. Should a data provider fall short of the agreement, the team will show how compensation and penalties can be calculated and applied automatically.
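The article does not describe how the Data Hub implements this, but the idea of service classes with automatic penalty calculation can be sketched as follows. Everything here – the class names, fees, latency thresholds and the cap on penalties – is a hypothetical illustration, not the platform's actual billing logic:

```python
# Hypothetical sketch: 'classes' of data-feed service with automatic
# compensation when the provider misses the agreed delivery latency.
from dataclasses import dataclass

@dataclass
class ServiceClass:
    name: str
    max_latency_ms: float      # promised delivery latency
    fee_per_month: float       # what the data consumer pays
    penalty_per_breach: float  # credit owed for each missed delivery

def monthly_settlement(sla: ServiceClass, observed_latencies_ms: list[float]) -> float:
    """Net amount the consumer owes after penalties are deducted."""
    breaches = sum(1 for lat in observed_latencies_ms if lat > sla.max_latency_ms)
    penalty = breaches * sla.penalty_per_breach
    # Assumed rule: penalties are capped at the monthly fee, so the
    # provider never pays out more than it charged.
    return max(sla.fee_per_month - penalty, 0.0)

gold = ServiceClass("gold", max_latency_ms=100, fee_per_month=500.0,
                    penalty_per_breach=50.0)
# Two of the four deliveries exceed 100 ms, so two penalties apply.
print(monthly_settlement(gold, [80, 95, 120, 300]))  # 500 - 2*50 = 400.0
```

A cheaper ‘bronze’ class would simply carry a higher `max_latency_ms` and a lower fee; the settlement logic stays the same.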

  • Trust, privacy and security

Data providers offer data feeds with associated terms and conditions and the Data Hub ensures data consumers abide by these. Additionally, data providers can impose privacy rules on their data feeds, assigning who is authorized to access their data and with what conditions.
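One simple way to picture per-feed privacy rules is a policy table the hub consults before releasing data. The feed names, roles and conditions below are invented for illustration; the real Data Hub's policy model is not described in the article:

```python
# Hypothetical sketch: the hub checks a data provider's access policy
# before a consumer may read a feed.
feed_policies = {
    "traffic-flow": {"allowed_roles": {"developer", "council"},
                     "conditions": "non-commercial use only"},
    "waste-bins":   {"allowed_roles": {"council"},
                     "conditions": "internal reporting only"},
}

def authorize(feed: str, role: str) -> bool:
    """True only if the provider's policy grants this role access."""
    policy = feed_policies.get(feed)
    return policy is not None and role in policy["allowed_roles"]

print(authorize("traffic-flow", "developer"))  # True
print(authorize("waste-bins", "developer"))    # False: council-only feed
print(authorize("air-quality", "council"))     # False: no policy registered
```

The attached `conditions` string stands in for the terms and conditions a consumer must accept; a production system would record that acceptance rather than store it as free text.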

This iteration of the Catalyst is also focused on ensuring that data collection is secure, meaning that only authorized devices (e.g. sensors, gateways) can connect to the platform. ‘Rogue’ devices, which could send false data, will be rejected.
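Rejecting rogue devices requires the platform to distinguish registered hardware from unknown senders. A common pattern – assumed here, since the article does not name the mechanism – is a per-device shared secret used to sign each payload:

```python
# Hypothetical sketch: only devices registered with a shared secret can
# submit data; unknown or mis-signed ('rogue') submissions are rejected.
import hashlib
import hmac

REGISTERED_DEVICES = {"sensor-001": b"per-device-secret"}  # id -> secret

def sign(device_id: str, payload: bytes) -> str:
    """What a legitimate device computes before sending its reading."""
    return hmac.new(REGISTERED_DEVICES[device_id], payload,
                    hashlib.sha256).hexdigest()

def is_authentic(device_id: str, payload: bytes, signature: str) -> bool:
    key = REGISTERED_DEVICES.get(device_id)
    if key is None:
        return False  # unknown device: reject outright
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking the signature via timing.
    return hmac.compare_digest(expected, signature)

reading = b'{"bin": 42, "fill_pct": 80}'
good_sig = sign("sensor-001", reading)
print(is_authentic("sensor-001", reading, good_sig))   # True
print(is_authentic("sensor-999", reading, good_sig))   # False: not registered
print(is_authentic("sensor-001", reading, "0" * 64))   # False: forged data
```

Real IoT platforms typically use per-device certificates and TLS rather than bare HMACs, but the gatekeeping idea is the same: no recognized credential, no connection.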

You can see the Data Hub in action at TM Forum Live! in Nice in May.

Contact Nektarios Georgalas of BT to find out more.

Andreas Polz of BearingPoint gives a brief overview of the Catalyst in a video interview.

About The Author

Sarah is a freelance writer and editor with an interest in new technologies and how they impact our everyday lives.
