BUILDING A SMART CITY INFRASTRUCTURE: THE KEY ROLE OF APIS AND MICROSERVICES

  • 19th July 2018

How APIs, microservices and visual interfaces can be best used to build the smart city of tomorrow

Smart cities are big business. According to a report published in July 2017 by analyst MarketsandMarkets, the global smart cities market is projected to grow from $424.68 bn in 2017 to $1,201.69 bn by 2022 – a compound annual growth rate of 23.1 per cent.

Sensor-based Internet of Things (IoT) technology is underpinning this growth, and an important aspect of this is streetlighting. Existing streetlights are a key pillar of smart city development: they are already connected to a power source, and their shape and height allow them to act as antennas for the sensor network.

Other sensors can then piggy-back on the streetlight network, connecting with each lamp to send their data using low-power communications. This enables a smart city network to be developed without the authorities having to install an extensive new array of powered sensors across the city.

By utilising the existing infrastructure, cities can realise significant environmental and operational benefits while saving public money. Typical applications include improving drainage systems, ensuring bridges are safe and looking after green spaces.

The success of such endeavours is dependent on the effective analysis of vast volumes of sensor data. That makes it critical to implement a technology architecture capable of handling mass data flows – and supporting the smart city of tomorrow – by enabling the collation, ordering and visualisation of this data; data which might relate to pollution statistics, road surface conditions or drain levels.

The role of APIs and microservices

Here, we review how three critical areas of this architecture – application programming interfaces (APIs), microservices and visual interfaces – can be best used today to build the smart city of tomorrow.

On the sensor side of the process, the city authorities will typically use specialised APIs as the providers will generally be building data models that are specific to the sensor data. However, on the application side APIs must be much more flexible in order to bring multiple data sources together in a single system. If the application interface is too rigid, new API endpoints will need to be developed to pull each new type of data into the system, adding cost and time to the process of integrating sensors into a single central data hub.
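One way to achieve that flexibility is to accept a single generic reading format on the application side, so that a new sensor type does not require a new endpoint. The sketch below is illustrative only: the `SensorReading` schema and field names are assumptions, not a real smart city API.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class SensorReading:
    """Generic reading accepted by one application-side endpoint.

    The payload carries its own sensor_type plus an open-ended `values`
    mapping, so new sensor types can be ingested without new endpoints.
    """
    sensor_id: str
    sensor_type: str   # e.g. "air_quality", "drain_level"
    timestamp: str     # ISO 8601
    values: Dict[str, Any] = field(default_factory=dict)

def ingest(payload: dict) -> SensorReading:
    """Minimally validate and normalise an incoming JSON-style payload."""
    for key in ("sensor_id", "sensor_type", "timestamp"):
        if key not in payload:
            raise ValueError("missing required field: " + key)
    return SensorReading(
        sensor_id=payload["sensor_id"],
        sensor_type=payload["sensor_type"],
        timestamp=payload["timestamp"],
        values=payload.get("values", {}),
    )
```

With this shape, plugging a new sensor type into the central data hub is a data change rather than an API change.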

However, irrespective of how versatile the application-side APIs are, there will always need to be some level of transformation of the data from device-specific APIs into delivery application ones. Moreover, this transformation needs to be carried out at scale and on demand. That is where a microservices approach comes in, in which large applications are developed from a suite of modular components or services.
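A transformation microservice of this kind is often just a small mapping function. The following sketch assumes a hypothetical vendor payload format (the `dev`, `ts` and `temp_c` field names are invented for illustration) and maps it onto a canonical application-side shape.

```python
from datetime import datetime, timezone

def transform_streetlight_temp(device_payload: dict) -> dict:
    """Map one vendor's device-specific fields onto the application's
    canonical schema. Field names on both sides are illustrative."""
    return {
        "sensor_id": device_payload["dev"],
        "sensor_type": "temperature",
        # Vendor sends a Unix epoch timestamp; normalise to ISO 8601 UTC.
        "timestamp": datetime.fromtimestamp(
            device_payload["ts"], tz=timezone.utc
        ).isoformat(),
        "values": {"celsius": device_payload["temp_c"]},
    }
```

Because each function is stateless, it can be deployed as its own microservice and scaled out independently as sensor traffic grows.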

One key role of microservices is as a data filter. If you consider the vast data volumes being collected by smart cities through sensors, microservices can help to filter inconsequential data and then transmit significant data to the right places, which then allows data analysis to happen at a more general level.

For example, an authority may have implemented multiple sensors to measure temperature variations across the city. Microservices and event-driven processing can provide a valuable service here by reducing multiple measurements into key notifications of a predefined threshold being exceeded. It is easy to imagine a similar approach being used to monitor noise and air pollution too.

Scalability is a big advantage of microservices. They can be quickly scaled up – when multiple sensors all decide to send data at the same time, for example – but they are relatively inexpensive because they typically run for only a few seconds at a time and users only pay for the time they are in use.

Another plus is that microservices are easy to deploy, and they are typically easy to write and maintain as well.

Data analysis and operational workflow

Once the microservices have completed their work, the data is passed through smart city application APIs for processing. At this point, the application needs to have a strong visual interface that helps the authorities understand and make sense of the data that has been collected.

That’s where the visual element of the interface in this connected asset infrastructure is so important. Dashboards need to be in place. Strong iconography and colours can be used to differentiate between data items, or to link one item with multiple other similar items, for example. User experience and usability elements can also be built into the interface, encouraging users to interact and experiment with it: establishing patterns, analysing the results of data enquiries and driving new insights.

Insights alone are, however, of little value in building the smart city unless they ultimately result in concrete actions being delivered. To close the loop, rules need to be put in place that trigger immediate actions, such as a service engineer call if a street light fails, or a maintenance visit if a drainage gulley is overflowing.
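Closing the loop can be as simple as a rules table mapping sensor events to work orders. The sketch below uses the two examples from the text; the field names, the 90 cm gulley threshold and the action labels are all assumptions for illustration.

```python
def actions_for(reading: dict) -> list:
    """Apply simple closed-loop rules to a sensor reading and return
    the concrete actions to trigger. Rules and thresholds are illustrative."""
    actions = []
    # Rule 1: a failed streetlight triggers a service engineer call-out.
    if reading.get("type") == "streetlight" and reading.get("status") == "failed":
        actions.append("dispatch_service_engineer")
    # Rule 2: an overflowing drainage gulley triggers a maintenance visit.
    if reading.get("type") == "drain_gulley" and reading.get("level_cm", 0) > 90:
        actions.append("schedule_maintenance_visit")
    return actions
```

In a production system these actions would feed a work-order queue, but the principle is the same: every rule turns an insight directly into an operational task.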

This workflow element is critically important to the success of any connected asset management approach within the smart city – and it must never be neglected if the data analysis carried out by the system is to result in tangible operational efficiency, environmental and safety benefits across the city as a whole.

Fulfilling the potential

The potential for smart cities is clear. IoT technology is helping to drive enhanced connectivity between assets. The authorities are using an architecture consisting of microservices, APIs and visual interfaces to make use of this connectivity to collate, order and visualise key data. By analysing patterns and trends in this data they can achieve insight into a wide range of issues affecting the smart city and use that insight to make cities safer, more productive and generally better places to live.