Designing for smarter cities with mixed reality

How can we promote collaborative governance and urban sustainability when planning for the cities of the future?

We believe that mixed reality offers interesting opportunities to engage citizens who are normally excluded from the planning process, and whose participation is important to foster urban sustainability.

Many contemporary urban challenges are best understood as complex. This means that they are often a confusing mess of interrelated problems which are difficult to define and are often disputed. This is why it is absolutely crucial for cities to promote citizen participation in urban planning. Citizen participation is important in order to mobilize knowledge, innovation and support for solving some of the most pressing sustainability issues of our age. Yet in reality urban planning often lags behind urbanization and lacks effective means of collaborating with citizens.

This is why we have decided to take on the challenge of thinking creatively around how ICT could foster citizen participation in urban planning and ultimately urban sustainability. By merging virtual and real world objects to produce new environments and virtualizations, mixed reality presents yet unexplored opportunities to build urban sustainability.

We have therefore developed a number of concepts for a mixed reality platform which could support any city to plan for the future.

The concepts present ways to visualize a city as it looks today as well as future urban plans, including buildings, parks or infrastructure that do not exist today but might in the future. Equally important, they present ways for citizens to explore alternative futures and urban data, including the impacts of existing urban plans on dynamic elements like traffic, noise, air quality, services, and so on. As we think that connectivity is a precondition for radical innovation, we have also tried to visualize how such a platform could support communication and collaboration among various stakeholders.

Based on field research, we think that mixed reality could help citizens to overcome some of the high barriers associated with citizen participation. By dissolving time and space, mixed reality can enable people who have difficulty being present at a certain location and time to participate in urban planning. Families with young children or people with restricted mobility are examples of groups whose participation could be supported. Certainly mixed reality presents interesting opportunities to foster engagement among youth. Supporting their participation is important for a number of reasons, not least to enable inter-generational dialogue and to foster young people’s self-esteem by reinforcing their sense of being an important part of their communities. It is therefore intriguing to ponder how mixed reality could create an enabling environment for participatory behavior, making it easier, cheaper and faster for citizens to engage with planning issues.

 

What will the cities of the future look like?

As we at Ericsson want our stakeholders to use our technology in ways that support democratic stewardship of cities, it is interesting to note that mixed reality can allow every individual to speak about his or her desires for the future (provided citizens have the digital skills required). We believe that when citizens are provided a safe space to speak, their self-confidence to participate in urban planning can increase.

New digital technologies like mixed reality also create ample opportunities to strengthen the transparency of urban planning. While this can enhance two-way communication and instill an impetus for improved accountability, it can also empower citizens to make informed choices. This could help to improve the quality of citizen dialogues as well as decision making, producing outcomes that are more likely to contribute to sustainable development.

Based on our research we therefore believe that mixed reality can enable new interesting interactions between cities and citizens, and support collaborative sense-making of urban challenges. While this could have a profound impact on inclusiveness and help to improve the representativeness of the planning process, it could also strengthen cities’ problem-solving capacity and responsiveness to complex urban challenges.

Multifaceted information and collaborative governance have to be recognized as key resources when designing and managing our cities. Looking ahead this will certainly be crucial to legitimize and create public support for smart city adoption.

 

Authors: Anna Viggedal, Cristian Norlin, Fanny von Heland, Joakim Formo, Marcus Nyberg.

Five takeaways from Keysight’s 5G Tech Connect conference

Last week, I had the opportunity to attend Keysight’s 5G Tech Connect conference near San Francisco. It was a private one-day event that included many 5G thought leaders. The speakers and panel participants ranged from researchers to service providers, who provided insight into the issues presented by 5G.

The discussions were oriented towards development and testing challenges, but also included some interesting applications. There was a room nearby with Keysight test equipment, but it was far from a sales event. In fact, Roger Nichols (Keysight 5G Program Manager and the event host) promised to boot out anyone filling out a lead form. Many of the new test products shown are still under wraps; you’ll have to wait until 2018 to hear about them. But the discussions were riveting, and the small crowd of about 200 allowed full audience participation.

Given that, here are my top five takeaways about 5G.

Over-the-air testing becomes essential
For years, the cellular industry has relied on conducted measurements to validate RF parameters in devices and base stations. 5G mmWave will end that. There won’t be any connectors because phased array antennas span a device. Instead of conducted power, Equivalent Isotropically Radiated Power (EIRP) becomes the key specification. EIRP is essentially the transmitter power multiplied by the antenna gain compared to an isotropic antenna. Measuring EIRP typically requires a far-field anechoic chamber (Figure 1).

Figure 1 For R&D testing in a typical far-field anechoic chamber, the device-under-test (DUT) is mounted on a positioner that rotates in two planes. But how feasible is this in the manufacturing environment? Note that shorter wavelengths lengthen the chamber because the far-field distance scales as 2D²/λ for an aperture of size D. Image courtesy of Keysight Technologies.
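To make these numbers concrete, here is a minimal sketch (my own illustration, not from the conference) of the two calculations involved: EIRP as conducted power plus antenna gain, and the classical Fraunhofer far-field distance 2D²/λ that drives chamber size. The 28 GHz frequency, 15 cm aperture, 10 dBm conducted power and 20 dBi gain are assumed example values.

```python
import math

def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float) -> float:
    """EIRP in dB terms: conducted power plus antenna gain over isotropic."""
    return tx_power_dbm + antenna_gain_dbi

def far_field_distance_m(aperture_m: float, freq_hz: float) -> float:
    """Classical Fraunhofer far-field distance: 2 * D^2 / wavelength."""
    wavelength_m = 299_792_458.0 / freq_hz
    return 2 * aperture_m ** 2 / wavelength_m

# Assumed example values: 28 GHz array, 15 cm aperture, 10 dBm conducted, 20 dBi gain.
print(eirp_dbm(10.0, 20.0))                        # 30 dBm EIRP
print(round(far_field_distance_m(0.15, 28e9), 2))  # ~4.2 m minimum test distance
```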

OTA will impact the entire testing chain, not only how test is performed, but where as well. R&D may be able to afford far-field anechoic chambers for development, but people in the manufacturing supply chain will have to think this through. Test every device in an anechoic chamber? Go to near-field testing to reduce chamber size? Drive six-sigma quality through the supply chain, and merely test whether the final device functions? All these options were discussed during the conference. Lucas Hansen, Senior Director for Chipset and Component Testing at Keysight, opined that the first manufacturing runs would require some anechoic testing. Later, devices would deploy self-testing or DUT-assisted testing techniques to eliminate this requirement.

Is this feasible? Perhaps. Dr. Gabriel Rebeiz showed the results of phased-array antennas “built like digital boards” at UC San Diego (Figure 2). Solder in the components and go. They delivered impressive performance and repeatability, even without calibration. He emphasized how modern foundries and manufacturing methods have “made phased-array easy” and noted that one particularly impressive array was created by just two students. Students, he reminded us, have other priorities on campus, often social, yet they still pulled this off.

Figure 2 This 64-element phased-array antenna developed at UC San Diego showed high performance and repeatability, even without calibration. Image courtesy of Keysight Technologies.

R&D plays a proportionally bigger role
Beamforming adds a great deal of complexity to 5G. It’s not just the physical characterization of the devices, or the measurement of the beam patterns—it’s the entire set of protocols to even know where to aim the beam. Moray Rumney, Keysight’s representative on the 3GPP radio committee, walked us through the ladder diagram of how a device is bonded to a base station while both units are aligning their beams. Will it work? Maybe. Fixed Wireless Access (FWA) will be easier as both devices are essentially motionless. Mobile communications is, however, much harder. These tests are all in the R&D area. Once the beamforming algorithms are developed, there is no manufacturing test.
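The flavor of that alignment step can be sketched with a toy exhaustive beam sweep (a simplified illustration of the general idea, not the actual 3GPP procedure Rumney presented): the base station transmits on each candidate beam, the device measures every transmit/receive beam combination, and the strongest pair is selected.

```python
import numpy as np

def best_beam_pair(rsrp_dbm: np.ndarray) -> tuple[int, int]:
    """Pick the (tx_beam, rx_beam) pair with the highest measured RSRP.
    rsrp_dbm is an (n_tx_beams, n_rx_beams) matrix of measurements."""
    tx, rx = np.unravel_index(np.argmax(rsrp_dbm), rsrp_dbm.shape)
    return int(tx), int(rx)

# Toy measurement matrix: 8 base-station beams x 4 device beams (made-up values).
rng = np.random.default_rng(0)
measurements = -110.0 + 30.0 * rng.random((8, 4))
tx, rx = best_beam_pair(measurements)
print(f"align on base-station beam {tx} and device beam {rx}")
```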

Low latency testing is another R&D-centric test. 5G includes a specific use case called Ultra-Reliable Low Latency Communication (URLLC), which promises to enable safety-critical applications such as remote surgery or autonomous vehicles. The one-millisecond latency goal is needed for enabling any application that requires tactile feedback.
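One way to appreciate how tight 1 ms is comes from a simple propagation check (my own back-of-the-envelope arithmetic): before any processing or queuing, distance alone consumes part of the budget.

```python
# Back-of-the-envelope: how far can a signal travel within a 1 ms one-way budget?
C_VACUUM_M_S = 299_792_458.0     # speed of light in vacuum
FIBER_FACTOR = 2.0 / 3.0         # light in optical fiber travels at roughly 2/3 of c

budget_s = 1e-3                  # the 1 ms URLLC latency goal
print(round(C_VACUUM_M_S * budget_s / 1e3))                 # ~300 km over the air
print(round(C_VACUUM_M_S * FIBER_FACTOR * budget_s / 1e3))  # ~200 km through fiber
# Whatever distance is used, processing, scheduling and retransmissions
# must fit in the time that remains.
```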

Shown at the conference, and announced this week, is the Keysight UXM 5G wireless test platform (Figure 3), which performs beamforming algorithms and latency tests. It is essentially a network emulator for device software testing. While the UXM isn’t new, up until now it tested devices to the Verizon pre-5G Technical Forum (5GTF) specifications. This week, Keysight announced that the New Radio (NR) specifications have been incorporated into the tester, matching the long-term 5G radio specifications.

Figure 3 Keysight unveiled the UXM 5G wireless test platform. The UXM now supports the NR waveforms and tests for Advanced channel bandwidth, beamforming, latency, 8CC aggregation, and a comprehensive set of L1/L2 actions. Photo by Martin Rowe.

The silicon wild card
Can silicon be effective at mmWave frequencies? Will silicon’s lower efficiency eliminate it for power amp applications?

The pre-event dinner featured a keynote by Maryam Rofougaran, Co-CEO and COO at Movandi Corporation. Rofougaran led us through her history of designing silicon RF devices that had previously been thought impossible. After her first start-up, Innovent Systems, was acquired by Broadcom, she was promoted to senior VP of Radio Engineering and led a worldwide team of more than 300 engineers at Broadcom developing wireless radios for combination chips. She and her brother recently cofounded Movandi to create high performance mmWave devices using low-cost bulk CMOS processes. Movandi recently announced the BeamX, a complete 28 GHz front end module, from the phased-array antenna to the baseband interface. Movandi seeks to deliver 4.5 dB better link budgets while consuming 30 percent less transmit power (Figure 4).

Figure 4 The original Movandi BeamX prototype is a 64-element phased array antenna based on bulk CMOS processes. Image courtesy of Movandi Corporation.
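For a rough sense of what 4.5 dB of extra link budget buys, a small conversion sketch is shown below (my own arithmetic, assuming simple free-space propagation, not Movandi's analysis).

```python
import math

def db_to_linear(db: float) -> float:
    """Convert a dB power ratio to a linear ratio."""
    return 10 ** (db / 10)

extra_link_budget_db = 4.5
power_ratio = db_to_linear(extra_link_budget_db)   # ~2.8x more link margin
# Under free-space path loss (received power falls off as 1/d^2),
# range scales with the square root of the power ratio.
range_factor = math.sqrt(power_ratio)              # ~1.7x more range
print(round(power_ratio, 2), round(range_factor, 2))
```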

The jury is still out on silicon versus III-V processes, but it is clear silicon is making gains.

Wait for mobility
Beamforming is hard. Beamforming with a quickly moving object at mmWave frequencies is even harder. I’m sure it will be solved (see my next takeaway), but fixed beamforming will be solved first, and will be an industry-wide learning experience. A year and a half ago I predicted that fixed wireless access would be the first 5G killer application. It’s the combination of lower degree of difficulty and known business opportunity that keeps me sticking with this prediction. Verizon has already announced its intent to do exactly that in 2018.

Figure 5 Delivering fixed broadband using wireless technology will be Verizon’s first commercial deployment of 5G technology. Image courtesy of Verizon.

There are a number of unknowns that will need to become “knowns” for mmWave mobile to work. Beamforming at speed, over-the-air performance and testing, and finally the question of which business models and applications can justify billion-dollar investments in mobile all remain challenges. If 5G mobile is deployed this decade, expect it to be in the sub-6 GHz spectrum using massive MIMO.

Massive innovation
5G represents the most disruptive generational change in cellular networks since the move to digital. Everything is hard: higher frequencies, ill-behaved propagation, beamforming, and over-the-air testing. We don’t even know what we don’t know yet.

That said, the pace of 5G innovation is simply breathtaking. Whether the business case demands this or not, an ecosystem ranging from academia to component suppliers to carriers has emerged, with giant investments behind them all. Perhaps it’s the fear of being left out. Like a western town in the late 1800s, the players feel compelled to belly up to the bar and place their pistols on the counter. The conference ended with a talk by Dr. Mischa Dohler, from King’s College London and head of the Centre for Telecommunications Research. It was an inspiring talk in which Dohler led us through new applications being conceived and prototyped through his research. While Industry 4.0 empowers robots, he has dubbed the empowerment of humans “Human 4.0”. Remote stroke detection in ambulances, exoskeletons that let a master surgeon train new surgeons, and remote practice of the performing arts were just some of the examples he presented.

Figure 6 With 5G communications, a surgeon could wear a sensing glove while performing remote surgery. Image courtesy of Ericsson.


Empowering humans, indeed.

Author: Larry Desjardin served in several R&D and executive management positions with Hewlett-Packard and Agilent Technologies.


How will the transportation system benefit from IoT-enabled platooning?

Platooning is an innovative transport system where trucks can drive closely together – one after another – using a common communication system based on smart technology. This could lead to benefits for the transport system with regards to safety, efficiency and the environment.

Ericsson and Scania have started a collaborative research effort to accelerate the connectivity of commercial vehicles and infrastructure. Truck platooning is just one example of this, and we have assessed it in terms of its sustainability impact. We wanted to know how – and to what extent – ICT has the potential to improve the impact of truck platooning when it comes to safety and efficiency.

In its more basic form, truck platooning could be based on an Adaptive Cruise Control system comprising onboard radar and other electronic equipment, here referred to as conventional technology. In this set-up each truck optimizes its own behavior. Adding Vehicle-to-Vehicle (V2V) communication enabled by ICT aims to deliver further fuel savings and reductions in carbon dioxide emissions through shorter inter-vehicle distances and more extensive platooning.

Conventional technology (Scenario A), based on Adaptive Cruise Control, radar and other electronic equipment, can be used to drive in platoons for 25% of the full distance. The potential fuel reduction is calculated to be 2% based on Scania test track driving. In a theoretical scenario where platooning based on conventional technology would be used during the whole distance the fuel reduction potential would be 8%.

In a theoretical scenario with V2V communication, corresponding to 100% of the distance being driven in a platoon, the fuel saving potential is estimated by Scania to be 12% for the investigated set-up (compared to 8% for conventional technology). In practice, it is unlikely that the full distance would be suitable for platooning. However, V2V communication enables a more extensive use of platooning than the solution based on conventional technology. We therefore assume that 50% of the whole distance is driven in platooning mode (Scenario B). The overall 6% saving represents an increase of 4 percentage points compared to the 2% saving without the use of ICT. This translates into a reduction of about 4 tonnes of CO2 per truck per year. The non-platooning reference scenario is based on an average truck in the EU that travels about 100,000 km per year with a fuel consumption of about 0.25 l/km, which corresponds to emissions of about 66 tonnes of CO2 per year. A 4-tonne saving in CO2 can be compared to the average annual CO2 emissions per citizen in the EU of 7.4 tonnes in 2012 (including emissions from transportation).

To understand the net impact of V2V communication, the additional saving it enables is compared to the additional footprint of the communication solution, which amounts to an increase of 0.14% per truck (a factor of only approximately 1:30 compared to the estimated additional saving), indicating a substantial net reduction in carbon emissions.
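The arithmetic behind these figures can be reproduced in a few lines (a sketch using only the numbers quoted above; the CO2-per-litre factor is the one implied by the article's own figures, not an independent source).

```python
# Reproducing the platooning figures quoted above (per truck, per year).
distance_km = 100_000            # average EU truck mileage
fuel_l_per_km = 0.25
baseline_co2_t = 66.0            # tonnes of CO2 per year for the reference truck

fuel_l = distance_km * fuel_l_per_km                   # 25,000 litres per year
implied_kg_co2_per_l = baseline_co2_t * 1000 / fuel_l  # ~2.64 kg CO2 per litre

saving_conventional = 0.02       # Scenario A: 25% of distance platooned with ACC only
saving_with_v2v = 0.06           # Scenario B: 50% of distance platooned with V2V

print(round(baseline_co2_t * saving_with_v2v, 1))     # ~4.0 t CO2 saved per truck per year
extra_saving = saving_with_v2v - saving_conventional  # the 4 percentage points enabled by ICT
ict_footprint = 0.0014                                # 0.14% footprint of the V2V solution
print(round(ict_footprint / extra_saving, 3))         # ~0.035, i.e. roughly 1:30
```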


From a social perspective, truck platooning could have a vast range of impacts, from driver satisfaction due to common breaks along the roadside and more social interaction, to feeling passive during the drive and possible stress over the threat of job losses caused by automation. Safety could also be increased or decreased depending on the perspective, i.e. that of the truck driver or of the surrounding traffic.

Platooning is not yet deployed, and wide-scale tests, technical feasibility studies and upscaling are under development out to 2020. In order to use public road networks, further tests and legislation are necessary to prove safety and reliability. A greater level of automation, and legislation for broad commercial application with automated driverless trailing vehicles, is estimated for 2030 and beyond.

Author: Sepideh Matinfar

Beamforming to Expand 4G and 5G Network Capacities

Most wireless subscribers believe all is well with their network coverage. The wireless industry knows the future tells a different story. 4G LTE has reached the theoretical limits of time and frequency resource utilization, while 5G will need new technology to meet its full potential.

The wireless industry is working feverishly to open a new degree of freedom and space for enhancing network capacity and performance to address growing connectivity demands. Engineers are looking at spatial dimension innovations, falling under the category of space division multiple access (SDMA), that will help deliver significant network capacity and performance.

Keeping pace with demand

With SDMA, the idea is to use software-driven, beamforming antennas to enable multiple concurrent transmissions using the same frequency without interference, thus allowing for abundant spectrum reuse with higher intensity signals delivered to both stationary and mobile users. This way, mobile operators can continuously reuse the same band of spectrum, at the same time, within a given spatial region, and direct coverage to where it’s needed, when it’s needed.

Wireless carriers and OEMs are considering two technologies for bringing electronic beamforming to 4G and 5G networks to meet the boundless growth in wireless data consumption: multiple-input and multiple-output (MIMO) and holographic beamforming.

Early MIMO deployments in 4G systems have been both exciting and disappointing. Exciting because real network capacity gains have been shown. Disappointing because hardware costs have outpaced performance gains. That is, scaling and costs have been sharply sublinear. Despite impressive near-field spectral efficiency achievements like those from the University of Bristol in 2017 (130 bps/Hz), the lack of applicability to far-field systems such as cellular suggests that single user MIMO (SU-MIMO) has maxed out.

Enter MU-MIMO

That leaves multi-user MIMO, where independent data beams are transmitted along diverse vectors. MU-MIMO is not without challenges, however. Practical MU-MIMO demos have shown that it is difficult to achieve linear capacity gain with the number of antenna/radio pairs used. In practice, the observed capacity gains have been more like one-tenth the number of radio/antenna combinations. The reason for this is obvious. Users are rarely spaced on an angularly uniform grid and so the use of so many radios results in overkill. Reducing the radio count does not help as the beams widen, thus exacerbating the problem.

More recently, attention has been drawn to MU-MIMO power consumption in cellular bands. Several researchers have pointed out that multi-GHz clockrate 8-bit ADCs (analog-to-digital converters) require significant power. For a 128-element MU-MIMO array this implies at least half a kilowatt of power needed just for the ADC components. The dissipated thermal load is substantial, which in turn drives cooling requirements, resulting in a heavy, bulky, power hungry, and costly system for MU-MIMO. It remains an open question if the cost of 128 radio chains is justifiable for 10× improvement. This situation does not get better in millimeter-wave bands where even larger arrays are needed for sufficient antenna gain while power amplifier efficiencies plummet to under 5% at 60 GHz.
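The half-a-kilowatt figure is easy to sanity-check with rough assumed numbers (the per-ADC power below is an assumption for illustration, not a measured value quoted here).

```python
# Rough sanity check of ADC power in a large MU-MIMO array.
elements = 128
adcs_per_element = 2      # one converter each for the I and Q branches (assumed architecture)
watts_per_adc = 2.0       # assumed for a multi-GHz, 8-bit converter; real parts vary widely

adc_power_w = elements * adcs_per_element * watts_per_adc
print(adc_power_w)        # 512 W for data conversion alone, before amplifiers or cooling
```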

Holographic beamforming

Holographic beamforming (HBF) is a new technique that is substantially different from conventional phased arrays or MIMO systems in that it uses software defined antennas (SDAs). It is the lowest C-SWaP (cost, size, weight, and power) dynamic beamforming architecture available.

HBFs are passive electronically steered antennas (PESAs) that use no active amplification internally. This leads to symmetric transmit and receive characteristics for HBF antennas.

Where phased-array type PESAs use discrete phase shifters to accomplish beam steering, HBFs perform the task using a direct amplitude hologram. Figure 2 shows two different digital overlays on the HBF representing the bias states of the varactors generating the hologram. The hologram in Figure 2a steers an RF beam in one direction while the hologram in Figure 2b steers the beam to broadside.

Figure 2 (a) HBF with color overlay of the hologram used to steer the beam off broadside; (b) HBF with color overlay of the hologram used to steer the beam to broadside.
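For contrast, the conventional phase-shifter approach that HBF replaces boils down to applying a progressive phase across the array; a minimal textbook steering sketch for a uniform linear array is shown below (this is the classical calculation, not the hologram computation itself).

```python
import numpy as np

def steering_phases_deg(n_elements: int, spacing_wavelengths: float,
                        steer_angle_deg: float) -> np.ndarray:
    """Per-element phase shifts (degrees) that steer a uniform linear array
    off broadside by steer_angle_deg using conventional phase shifters."""
    theta = np.radians(steer_angle_deg)
    n = np.arange(n_elements)
    phases_rad = -2.0 * np.pi * spacing_wavelengths * n * np.sin(theta)
    return np.degrees(phases_rad) % 360.0

# 8 elements at half-wavelength spacing, beam steered 30 degrees off broadside.
print(steering_phases_deg(8, 0.5, 30.0).round(1))
```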

All components used in the construction of HBF antennas are high-volume, commercial off-the-shelf (COTS) parts. These incredibly low-cost control components take advantage of their widespread use in handsets, leading to economies of scale that silicon implementations can only dream of.

Equally important, the beam pointing function is accomplished using a large array of reverse biased varactor diodes. This leads to a nearly negligible power draw by the antenna’s pointing operations. Most HBFs need only USB or PoE (power over Ethernet) levels of power to operate. This then eliminates the need for active or passive cooling solutions and drives a significant size and weight reduction.

MIMO uses antenna/radio pairs to achieve beamforming with a very complex baseband unit coordinating the system. Holographic beamformers have simple control and use more densely packed antenna arrays. Roughly 2.5-3× as many elements are used by HBF systems. Fortunately for HBF, the control elements needed are trivially priced. These differences are summarized in Figure 3.
Figure 3 Summary of key differences among holographic, phased array and MIMO beamformers

The benefits of beamforming will not materialize in the commercial market without the low C-SWaP architecture that only HBF provides. MIMO’s C-SWaP is exorbitantly high. HBF represents a breakthrough beamforming technology that finally provides a viable C-SWaP profile for commercial 4G and 5G networks.

Author: Eric Black


Major breakthrough on mmWave propagation and channel modeling

Understanding radio propagation in the mmWave frequency range is vital for the development of 5G. Ericsson Research experts, in partnership with European researchers in the mmMAGIC project, have performed extensive mmWave channel measurements and modeling for a multitude of scenarios. Let’s look at our key achievements.

Present cellular communication systems utilize frequencies below 6 GHz. The frequency range 24-86 GHz – which is subject to allocation as International Mobile Telecommunications 2020 (IMT-2020) spectrum at the World Radiocommunication Conference 2019 – is, however, mainly in the mmWave range. Compared with frequencies below 6 GHz, the loss in radio signal due to absorption in materials or blockage by buildings, vegetation, vehicles, and humans is expected to be substantially different in the mmWave range. Moreover, important radio channel characteristics such as multi-path delay spread and directional spread have previously been poorly understood in the mmWave range.
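A quick free-space comparison illustrates the scale of the difference (a standard Friis calculation of my own, not a project result): moving from 3.5 GHz to 28 GHz costs about 18 dB at the same distance with isotropic antennas, before any material absorption or blockage is counted.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Friis free-space path loss between isotropic antennas."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

distance = 100.0  # metres
for freq in (3.5e9, 28e9, 60e9):
    print(f"{freq / 1e9:5.1f} GHz: {free_space_path_loss_db(distance, freq):.1f} dB")
# The gap between 28 GHz and 3.5 GHz is 20*log10(28/3.5), about 18 dB, at any fixed distance.
```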

To overcome this knowledge gap, the mmMAGIC project (co-funded by the European Commission’s H2020 program) has undertaken a major effort in the area of propagation research, with extensive measurement campaigns performed over 2-80 GHz for a multitude of different indoor and outdoor scenarios. In addition, researchers have performed simulation campaigns in selected popular environments and frequencies to provide a large data set of propagation channels for the purpose of channel modelling. Overall, 54 single-frequency equivalent campaigns have been conducted. An overview of these measurements and simulations is depicted in the figure below.

Requirements for channel measurements

The assessment of any frequency dependency over the measured range is key in the development of 5G mobile communications and spectrum allocation. To ensure comparability between channel measurements at different frequencies, a set of important requirements has been established:

  • Equal measurement bandwidth
  • Equal antenna pattern, either physical or synthesized
  • Equal dynamic range for analysis both in delay and angle domains
  • Equal angle resolution (for example, array size equal in terms of number of lambda)
  • Same environment and same antenna locations

The measurement data in the mmMAGIC project has been thoroughly analyzed, ensuring that the above requirements were fulfilled.

Key results and contributions in channel modeling

Based on channel measurements and thorough analysis, the key characteristics of mmWave propagation can now be largely understood. All details are publicly available in the project’s final report Measurement Results and Final mmMAGIC Channel Models. The key results have, to a large extent, already been incorporated into propagation models in 3GPP and ITU-R, as listed below:

  1. Extensive high quality measurement data contributed to 3GPP 5G channel modeling.
  2. Measurements and modeling of building penetration loss used as substantial input to 3GPP and ITU-R models.
  3. A substantially improved blockage model adopted by ITU-R.
  4. A ground reflection component added to the ITU-R IMT-2020 channel model.
  5. Thorough statistical analysis (determining the confidence ranges shown in the figures below) of the frequency dependency of delay spread and angle spread, which showed no significant frequency dependency, in contrast to the less thorough result of the 3GPP modelling effort.

Indoor measurement of angle spread in line of sight (LOS) and non-line of sight (NLOS)
Frequency dependence of RMS delay spread of the 3GPP and mmMAGIC channel models. The confidence ranges of the mmMAGIC model are indicated with dashed lines.
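For readers less familiar with the metric, RMS delay spread is the power-weighted standard deviation of the multipath delays; a minimal sketch of the computation from a power delay profile is shown below (the delays and powers are made-up illustrative values, not mmMAGIC data).

```python
import numpy as np

def rms_delay_spread_s(delays_s: np.ndarray, powers_linear: np.ndarray) -> float:
    """Power-weighted standard deviation of the multipath delays (in seconds)."""
    weights = powers_linear / powers_linear.sum()
    mean_delay = np.sum(weights * delays_s)
    return float(np.sqrt(np.sum(weights * (delays_s - mean_delay) ** 2)))

# Toy power delay profile: delays in nanoseconds, path powers in dB (illustrative only).
delays = np.array([0.0, 30.0, 80.0, 200.0]) * 1e-9
powers_db = np.array([0.0, -3.0, -8.0, -15.0])
print(round(rms_delay_spread_s(delays, 10 ** (powers_db / 10)) * 1e9, 1), "ns")
```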

 

mmMAGIC project

The mmMAGIC consortium comprises 18 organizations in Europe (Samsung, Ericsson, Huawei, Nokia, Alcatel-Lucent, Intel, Orange, Telefonica, Keysight Technologies, Rohde & Schwarz, HHI, CEA-Leti, IMDEA Networks, Bristol University, Chalmers University, TU Dresden, Qamcom, Aalto University). For details on the mmWave radio interface design, please visit the project webpage.

Author: Ali Zaidi

TOWARDS APPLICATIONS OF FOG COMPUTING FOR DESIGNING SMART CITY VERTICALS


In this digital era, governing bodies and infrastructure planners are designing initiatives to transform cities into smart cities. In order to make cities smarter, they are using digital technology in each smart city domain, such as transportation, traffic management, safety, energy and much more. Urban services governed via the internet can improve the living standards of citizens. There is a huge trade-off as well: having billions of people, devices and things connected to the network will cause a surge in network load. It is projected that there are currently fifteen billion devices connected to the internet, and that by 2020 the number will increase to 50 billion.

This multiplex assembly of connected devices will gather data from all kinds of mobile devices, smartphones, and sensors, creating a hefty data stream. As the network expands day by day, the volume of data increases accordingly. Moreover, the data produced needs to be arranged, processed, secured and analyzed. Therefore, new processing technologies must be introduced at the network edge so that network providers can operate intelligently, managing the huge volume of data in a distributed fashion across the region. As a result, the network can smartly analyze the data, control processes based on the analyzed information, and connect people and things efficiently. These data processing techniques should be able to process data instantly and ensure precise results so that cities can establish social, economic and environmental sustainability.

Smart cities should introduce new standards in order to ensure social and economic viability. They should provide a flexible and secure platform where data is intelligently analyzed at the network edge and efficiently communicated to the cloud. These standards should define an easy way to control and manage the smart city ecosystem through a straightforward set-up procedure and appropriate automation.

WHAT IS FOG COMPUTING?

Fog computing is a technology introduced in order to disburden the centralized cloud. It can be referred to as edge computing, as it operates at the network edge, unlike cloud computing, which hosts and works from a centralized cloud. In fog computing, data is processed locally in smart devices without being sent to the centralized cloud for processing. It is considered one of the best technologies for the deployment of the Internet of Things (IoT). The network architecture of fog computing is shown in the figure below.

Figure 1: An illustration of the fog computing architecture.

In 2015, Cisco, along with the City of Barcelona and certain other partners, conducted a Proof of Concept (PoC) on fog computing. The main purpose of this PoC was to realize the vision of fog computing. Fog computing is useful when dealing with real-time applications over limited bandwidth. With millions of devices connected to the network, the volume of data generated each second runs into petabytes and exabytes. This large volume of data cannot all be sent to the cloud, as the network always provides only limited bandwidth. Under such conditions fog computing is an excellent solution. Rather than setting up channels to the cloud for processing, it analyzes and aggregates the data at the network edge, thereby reducing the requirement for additional bandwidth. Through this distributed strategy it helps to use the allocated bandwidth efficiently and to lower network maintenance costs.
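As a sketch of the idea (with hypothetical sensor names and thresholds, not a Cisco or Barcelona API), an edge node might summarize raw readings locally and forward only compact aggregates, escalating the full detail to the cloud only when something unusual is seen.

```python
from statistics import mean

ALERT_THRESHOLD = 80.0   # assumed level above which a reading is "interesting"

def aggregate_at_edge(sensor_id: str, readings: list[float]) -> dict:
    """Summarize one window of raw readings at the fog node."""
    summary = {
        "sensor": sensor_id,
        "count": len(readings),
        "avg": round(mean(readings), 2),
        "max": max(readings),
    }
    # Only unusual windows carry the full raw data up to the central cloud.
    if summary["max"] > ALERT_THRESHOLD:
        summary["raw"] = readings
    return summary

print(aggregate_at_edge("air-quality-42", [61.0, 63.5, 58.9, 84.2]))
```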

FOG COMPUTING CHARACTERISTICS AND USES

Fog computing has several advantages; some of its noteworthy attributes and uses are discussed below.

In the context of smart cities, fog computing enhances the abilities of IoT and cloud computing by aggregating data before sending it to the cloud for further computation. As the information from sensors keeps increasing at an exponential rate, a bottleneck becomes more and more likely. Moreover, the flow of large amounts of data to the cloud may also incapacitate real-time communication applications. Therefore, fog computing provides a feasible platform for critical IoT applications such as the smart grid, connected vehicles, machine-to-machine communication, smart cities, and IoT services like Wireless Sensor and Actuator Networks (WSANs). An analysis of fog and cloud computing is shown in the figure given below.


Figure 2: Analysis of fog and cloud computing.

In Chicago, the traffic light system is controlled with the help of smart sensors. For instance, imagine a Tuesday morning on the day of a big parade celebrating the Chicago Cubs’ first World Series championship in more than a century. Heavy traffic is expected to enter the city as large crowds arrive to celebrate their team’s victory. As traffic increases, traffic volume is monitored through the smart traffic sensors and data is collected from the individual traffic lights.

The application developed by IoT specialists automatically adjusts the on and off patterns of the traffic lights in real time at the network edge, based on whether the monitored traffic volume grows or shrinks. As a result, visitors spend less time on the road and more time at the celebration.
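A much-simplified sketch of that kind of edge logic is shown below (hypothetical thresholds and timings; the actual Chicago deployment is not described at this level of detail): the green-phase length is adjusted locally from the measured vehicle count, with no round trip to the cloud.

```python
# Simplified sketch of edge logic for adaptive green-phase timing at one intersection.
MIN_GREEN_S, MAX_GREEN_S = 15, 90          # assumed allowed green-phase range (seconds)

def green_time_s(vehicles_per_minute: float) -> int:
    """Scale the green phase with measured demand, clamped to safe limits."""
    proposed = 15.0 + 0.5 * vehicles_per_minute
    return int(min(MAX_GREEN_S, max(MIN_GREEN_S, proposed)))

for demand in (10, 60, 200):               # quiet, busy, and parade-day traffic
    print(demand, "vehicles/min ->", green_time_s(demand), "s of green")
```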

FUTURE OF FOG COMPUTING FOR SMART CITIES

Cisco suggests that by utilizing both cloud and fog computing, IoT services can be used more smartly than ever before. Cisco says that by combining its Java Virtual Machine and IOS Linux platforms, it will be easy to port applications to a supportable environment. Moreover, many businesses are adopting both fog and cloud computing. It is worth mentioning that, after Cisco, Microsoft has also introduced Windows 10 IoT Core as an operating system optimized for devices that lack a screen, giving smart devices the ability to run Windows. IDC estimates that 40 percent of data will be stored, analyzed and processed on devices at the edge in the coming years, and that 50 percent of IoT use cases will be controlled by the network.

In recent years, fog computing has become a necessary component of IoT architecture. It manages services and controls the data stream from the network edge. This form of edge computing can thus reduce CAPEX and OPEX as well as the deployment time of smart city solutions. There are many smart city verticals where fog computing can be used, such as smart health, the smart grid, smart transportation, retail, industry and many others.

To learn more about applications for smart cities, challenges, and smart city technologies, you can join our Smart Cities Essential Courses here.

Towards Security Issues in Smart Homes


The Internet of Things (IoT) and the idea of the “smart home” have grown dramatically in the last few years. Smart home technology has changed our lifestyle in such a way that everything from the fridge to the furnace is now interconnected. People around the world are now embracing smart city applications, yet the security of personal data has become more important than ever. Recent reports about the risks of these applications have revealed breaches of smart home security. It was revealed in 2011 that user profiles from fitness tracker giant Fitbit were public and searchable on Google. Smart home devices are prone to data breaches as well. A recent article reported that Comcast’s Xfinity Home Security had flaws: it showed windows and doors as secured when in fact they were unlocked and open. Tod Beardsley of Rapid7, in response to these security weaknesses, told CSO that “IoT devices tend to be designed with a happy path in mind, often don’t consider an active adversary.” In this article, we discuss potential cyber security challenges for IoT-based smart homes along with some possible innovative solutions.


What are the key security challenges in Smart Homes?

It is evident that each week new challenges are reported around IoT security, fuelled by reports of homes suffering ‘digital break-ins’ and family members being intimidated by the perpetrators. Smart home security is a top priority for every individual. There are several potential smart home security solutions available that promise to keep smart home inhabitants safe with intelligent supervision at home. Smart homes can help their occupants not only to monitor their home remotely but also to ensure the safety of their children at home. However, deployment of IoT smart home solutions may create potential security issues for our personal data and privacy at home. Many app-controlled smart home devices relay information about the home to external servers for processing. By storing information in the cloud, these solutions offer convenient control of the home. An overview of potential security vulnerabilities in various IoT-based devices, such as smart plugs deployed in a smart home, is shown in figure 2 below. However, homeowners who opt for several smart home devices may need a separate app for each product. As a result, there is a greater chance of data breaches.

A case study of security drawbacks in Smart Homes

In this section, a case study of a smart home security camera issue is presented. Specifically, network-connected cameras have many security vulnerabilities because they often don’t encrypt data and have weak password policies, as reported in an article by the Infosec Institute. Although devices like webcams and hidden cameras are meant to give extra security assurance about one’s home, IP-based cameras connected to smart homes have drawbacks: they can allow hackers to access the camera recordings. For safety, it is recommended to research the security track record of the camera manufacturer so that you know the level of encryption used in the device. It is also recommended to use secure Wi-Fi and strong passwords to optimize security. About 73,000 security cameras were hacked in 2014 as a result of unchanged default passwords. Weak passwords are more prone to cyber attacks and should be taken seriously. Even though users cannot control a device’s security infrastructure, they do have the power to avoid public networks and set highly secure passwords. Lack of security infrastructure in the manufacturing of smart home devices is the most evident cause of the vulnerability of smart devices. A recent article by Business Insider asserted that IoT products lack industry security standards for their manufacture.

Future Work for security in Smart Homes:

Although IoT has made our lives easier, our personal data is more at risk. We are now responsible for our own protection against weakly secured IoT devices. Since smart home devices rely on remote access and cloud servers, securing customer data in the cloud and on the devices has become an unavoidable challenge. Despite the fact that connected home products promise consumer convenience, these devices may have some serious loopholes. Security needs should be included at an early stage of product design and implemented in every aspect of the system.

To learn more about smart security challenges and smart city technologies, you can join our Smart Cities Essential Courses here.

A Framework for Push-to-Talk Service Implementation using Voice over LTE (VoLTE) and its Key Features

With recent advancements in wireless technology and enhanced smartphone capabilities, there has been an apparent increase in the utilisation of Push-To-Talk over Cellular (PTToC). PTToC is a feature which can only be implemented when Voice over Long Term Evolution (VoLTE) capability is installed in the cellular network. PTToC has wide coverage since it uses a cellular network. Moreover, PTToC is implemented as a data call; therefore, calls can be made even when voice channels are congested. A pictorial representation of PTToC is shown in Figure 1 below.


Figure 1: An illustration of PTToC using various cellular networks.

The Role of PTToC in Mission Critical Communications

With the advancement of telecommunications, PTToC is currently being used in many commercial applications, for example public safety networks and business-critical sectors. This technology works in harmony with mission-critical applications offered by Land Mobile Radio (LMR) networks. Today’s cutting-edge communication techniques have enabled organisations to combine traditional voice communication with growing data sources in order to make better and faster decisions. PTToC is an essential component of LTE-based solutions. Moreover, encrypted Push-to-Talk over Cellular offers enhanced features such as location tracking, group texting and image messaging for real-time communications.

Key PTToC Capabilities

PTToC offers multiple benefits for agencies and organisations in the commercial and business sectors by providing mission-critical services. A brief description of PTToC’s salient features is given below.

Instant Global Connectivity

Since PTToC operates on modern digital cellular network standards, the entire world now shares the same network infrastructure. Public safety agencies can now connect anywhere around the globe in the same way as they would with terrestrial trunked radio systems.

Fast and secure public safety networks

PTToC has connection speeds equivalent to land-mobile radio communication. Thus, by integrating the benefits of PTToC and LMR, users can be provided with seamless mobile communication. With the introduction of encrypted PTToC, organisations are able to increase the number of users and make them capable of communicating over a DMR network in a secure way.

Flexibility in critical communication services

This technology is well suited to a wide range of devices, and users can access it on the device of their choice. PTToC is handy for field workers who need to use it while working in remote areas and harsh environments. Group calls in PTToC enable users to exchange information simultaneously with multiple users, so that all group members are fully aware of the situation.

In summary, the rise in the usage of PTToC is due to the speed offered by 3G and 4G mobile networks and the introduction of smartphones. With the introduction of new applications for PTToC technology, it seems that PTToC communications will be an essential component of emerging LTE-based solutions.

To learn more about the implementation of VoLTE, technologies, challenges, and RCS for public service, you can join our Voice over LTE (VoLTE) and RCS Certification Courses here.

How Can Telecom Operators Help Industries Design Innovative IoT Applications for Smart Cities?

The Internet of Things (IoT) is a technology which connects objects to the internet. Cisco has anticipated that by 2020, the IoT will consist of 50 billion devices connected to the internet, roughly six devices per person. It is expected that IoT will fully transform our economy, society and standards of living. Businesses and enterprises strive to bring products to market more frequently and to adapt to regulatory requirements, and most importantly business leaders tend to innovate persistently. With a large mobile workforce, a growing customer base and changing supply chain demands, IoT can help businesses and entrepreneurs generate substantial revenue. Figure 1 below depicts the connection of IoT devices to a cloud server over a 5G network.


Figure 1: IoT devices connected to a cloud server using a 5G cellular network. [Courtesy of Intel].

Only those companies that not only embrace IoT but also use it to transform their business will be able to maintain their position among their competitors. By integrating IoT into business operations, products and customer interactions, business leaders can build new business models and foundations of value. McKinsey estimates that by 2025, businesses will be able to generate revenue of approximately $11 trillion per year by incorporating IoT applications and products.

Businesses are now transforming their processes, operations and business models to benefit from the latest technologies. Smart cities, connected utilities, smart transportation, connected factories, smart health, the smart grid and connected miles are a few examples of this evolution. All industries consider IoT a breakthrough technology that can help them optimize their business, enter new markets and build good customer relationships. Many industry experts, like IDC, estimate that businesses will spend over $20 trillion in the next four years to realize the potential of IoT. An illustration of IoT deployed in smart city verticals is shown in Figure 2.


Figure 2: An illustration of Internet-of-Things applications in smart cities.

The revolution of IoT may have begun, but it isn’t implemented at full scale yet. There is still plenty of time before its transformational powers will be fully felt. A number of technical, economic, and regulatory issues still need to be addressed. There are companies out there that know they need to do something but are not sure how. According to Harvard Business Review and Verizon statistics, less than 10 percent of companies have deployed IoT initiatives, and only 56 percent have a proper strategy for IoT.

Recent research conducted by Cisco among ICT companies and decision makers concluded that the top three challenges for implementing IoT initiatives in their business were: (i) data privacy; (ii) standardization of IoT protocols and interoperability among different business systems; and (iii) design cost.

The IoT supplier market is currently very fragmented, with a massive number of big and small companies each providing single pieces of the IoT puzzle: devices, applications, and solutions. As a result, it is more challenging for companies to meet customer needs.

To learn more about applications of IoT and SON for smart cities, technologies, and challenges, you can join our Smart Cities Essential Courses here.