Jie REN Minglin LIU Lisheng LI Shuai LI Mu FANG Wenbin LIU Yang LIU Haidong YU Shidong ZHANG
The distribution station serves as a foundational component of power-system management. However, in areas without collection devices, data are missing because of deployment limitations, which hinders real-time, precise monitoring of distribution stations. This missing-data problem can be solved by pseudo measurement data deduction. Traditional pseudo measurement data deduction methods overlook the temporal and contextual correlations of distribution station data, resulting in lower restoration accuracy. Motivated by these challenges, this paper proposes a novel pseudo measurement data deduction model that minimizes the data collection requirements of distribution stations. Compared with a traditional GAN, the proposed enhanced GAN improves the architecture by decomposing the generator's input tensor, allowing it to handle high-dimensional and intricate data. Furthermore, we enhance the loss function to accelerate the model's convergence. The proposed approach allows the GAN to be trained in a supervised setting, effectively improving training accuracy. Simulation results show that the proposed algorithm outperforms existing methods.
Takahiro SASAKI Yukihiro KAMIYA
This paper proposes two VLSI implementation approaches for period-estimation hardware for periodic signals. Digital signal processing is an important technology, and estimating the periods of signals is widely used in areas such as IoT, predictive maintenance, anomaly detection, and health monitoring. This paper focuses on accumulation for real-time serial-to-parallel conversion (ARS), a simple parameter estimation method for periodic signals. ARS is a simple algorithm that estimates the periods of periodic signals without complex operations such as multiplication and division. However, ARS has so far been implemented only in software, and suitable hardware implementation methods remain unclear. Therefore, this paper proposes two VLSI implementation methods called ARS-DFF and ARS-MEM. ARS-DFF is a simple and fast implementation, but its hardware scale is large. ARS-MEM reduces the hardware scale by introducing an SRAM macro cell. This paper also designs both approaches in SystemVerilog and evaluates their VLSI implementations. According to our evaluation results, both proposed methods reduce power consumption to less than 1/1000 of that of an implementation on a microprocessor.
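The abstract describes ARS only at a high level, so the following is a hypothetical folding-accumulation sketch of addition-only period estimation in the same spirit, not the authors' exact ARS update rule: each candidate period folds the samples into accumulator bins, and the winner is scored with only additions, subtractions, and comparisons.

```python
def estimate_period(signal, min_p, max_p):
    """Estimate the period of a discrete signal using only additions,
    subtractions, and comparisons (illustrative folding sketch; the
    paper's exact ARS algorithm may differ)."""
    best_p, best_score = min_p, -1
    for p in range(min_p, max_p + 1):
        bins = [0] * p                    # one accumulator per phase
        for i, x in enumerate(signal):
            bins[i % p] += x              # fold samples cyclically
        score = max(bins) - min(bins)     # peak-to-peak of folded profile
        if score > best_score:
            best_p, best_score = p, score
    return best_p
```

For a periodic impulse train, the candidate period that aligns all impulses into one bin maximizes the peak-to-peak score, so the estimate requires no multiplier or divider, which is what makes a compact hardware mapping plausible.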
Archana K. RAJAN Masaki BANDAI
Constrained Application Protocol (CoAP) is a popular UDP-based data transfer protocol designed for constrained nodes in lossy networks. A congestion control method is essential in such environments to ensure proper data transfer. CoAP offers a default congestion control technique based on binary exponential backoff (BEB), in which the retransmission timeout (RTO) is calculated irrespective of the round-trip time (RTT). CoAP simple congestion control/advanced (COCOA) is a standard alternative algorithm. COCOA computes the RTO using an exponential weighted moving average (EWMA) according to the type of RTT sample: strong or weak. The constant weight values used in COCOA delay the variability of the RTO, and this delay in RTO convergence can cause QoS problems in many IoT applications. In this paper, we propose a new method called Flexi COCOA that computes the RTO with flexible weights for strong RTT samples. We show that Flexi COCOA is more network-sensitive than COCOA: the variable weight yields a better RTO and utilizes resources more effectively. Flexi COCOA is implemented and validated against COCOA using the Cooja simulator in Contiki OS. We carried out extensive simulations using different topologies, packet sending rates, and packet error rates. Our results show that Flexi COCOA outperforms COCOA and can improve the QoS of IoT monitoring applications.
Hiroshi YAMAMOTO Keigo SHIMATANI Keigo UCHIYAMA
In order to support people's various activities (e.g., primary industry, elderly care), IoT (Internet of Things) systems with sensing technologies that observe the status of people, things, and places in the real world are attracting attention. Existing studies have used camera devices as the sensing technology, but cameras suffer from a limited observation period and the invasion of privacy. Therefore, new IoT systems utilizing three-dimensional LiDAR (Light Detection And Ranging) have been proposed, because 3D LiDAR can obtain three-dimensional spatial information about the real world while avoiding these problems. However, several problems remain when deploying 3D LiDAR in real fields. 3D LiDAR requires considerable electric power to observe three-dimensional spatial information. In addition, annotating the large volume of point-cloud data needed to construct a machine-learning model requires significant time and effort. Therefore, in this study, we propose an IoT system that utilizes 3D LiDAR to observe the status of targets and equip it with new technologies to improve the practicality of 3D LiDAR. First, a linkage function is designed to save power across the entire system: during normal operation it runs only a low-power sensing technology, and it activates the power-hungry 3D LiDAR only when an observation target is estimated to be approaching. Second, a self-learning function analyzes the data collected by both the 3D LiDAR and a camera, automatically generating a large amount of training data whose correct labels are estimated from the camera images.
Experimental evaluations using a prototype system confirm that the sensing technologies can be correctly interconnected to reduce total power consumption, and that the machine-learning model constructed by the self-learning function can accurately estimate the status of the targets.
Kaiji OWAKI Yusuke KANDA Hideaki KIMURA
In recent years, the declining birthrate and aging population have become serious problems in Japan. To address them, we have developed an edge-AI-based system that predicts future heart rate during walking in real time and provides feedback to improve exercise quality and extend healthy life expectancy. In this paper, we predicted heart rate in real time with the proposed system and provided feedback. Experiments were conducted with and without the predicted heart rate, and the comparison demonstrates the effectiveness of heart-rate prediction.
Yoshinori ITOTAGAWA Koma ATSUMI Hikaru SEBE Daisuke KANEMOTO Tetsuya HIROSE
This paper describes a programmable differential bandgap reference (PD-BGR) for ultra-low-power IoT (Internet-of-Things) edge node devices. The PD-BGR consists of a current generator (CG) and a differential voltage generator (DVG). The CG is based on a bandgap reference (BGR) and generates an operating current and a voltage, while the DVG generates another voltage from the current. A differential voltage reference is obtained by taking the difference between the two voltages. The PD-BGR can produce a programmable differential output voltage by changing the multipliers of the MOSFETs in a differential pair and the resistance with digital codes. Simulation results showed that the proposed PD-BGR can generate 25- to 200-mV reference voltages in 25-mV steps within a ±0.7% temperature inaccuracy over a temperature range from -20 to 100°C. A Monte Carlo simulation showed that the coefficient of variation of the reference was within 1.1%. Measurement results demonstrated that our prototype chips can generate stable programmable differential output voltages, closely matching the simulation results. The average power consumption was only 88.4 nW, with a voltage error of -4/+3 mV across 5 samples.
Feng WANG Xiangyu WEN Lisheng LI Yan WEN Shidong ZHANG Yang LIU
The rapid advancement of cloud-edge-end collaboration offers a feasible solution for low-delay and low-energy-consumption data processing in the Internet of Things (IoT)-based smart distribution grid. The major concern of cloud-edge-end collaboration lies in resource management. However, the joint optimization of heterogeneous resources involves multiple timescales, and the optimization decisions of different timescales are intertwined. In addition, burst electromagnetic interference can affect the channel environment of the distribution grid, leading to inaccurate optimization decisions and hence negative effects such as slow convergence and severe fluctuations. We therefore propose a cloud-edge-end collaborative multi-timescale multi-service resource management algorithm. Large-timescale device scheduling is optimized by sliding-window pricing matching, which enables accurate matching estimation and effective conflict elimination. Small-timescale compression level selection and power control are jointly optimized by a disturbance-robust upper confidence bound (UCB) method, which perceives the presence of electromagnetic interference and adjusts its exploration tendency to improve convergence. Simulation results demonstrate the excellent performance of the proposed algorithm.
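The small-timescale selection builds on the classical UCB rule. The sketch below is plain UCB1 applied to choosing among discrete arms (e.g., compression levels); the paper's disturbance-robust variant additionally adapts the exploration term under interference, which is not reproduced here.

```python
import math

def ucb1_select(counts, rewards, t):
    """UCB1 arm selection: empirical mean plus exploration bonus.
    counts[a]  = number of times arm a was played,
    rewards[a] = cumulative reward of arm a,
    t          = total number of plays so far."""
    for a in range(len(counts)):
        if counts[a] == 0:
            return a                      # play each arm once first
    return max(range(len(counts)),
               key=lambda a: rewards[a] / counts[a]
                             + math.sqrt(2 * math.log(t) / counts[a]))
```

The bonus term shrinks as an arm is sampled more, so the rule balances exploiting the best-known compression level against exploring uncertain ones; robustness to interference amounts to tuning how that bonus reacts to corrupted reward observations.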
Youquan XIAN Lianghaojie ZHOU Jianyong JIANG Boyi WANG Hao HUO Peng LIU
In recent years, blockchain has been widely applied in the Internet of Things (IoT). The blockchain oracle, as a bridge for data communication between the blockchain and the off-chain world, has also received significant attention. However, the numerous and heterogeneous devices in the IoT pose great challenges to the efficiency and security of data acquisition for oracles. We find that the matching relationship between data sources and oracle nodes greatly affects the efficiency and service quality of the entire oracle system. To address these issues, this paper proposes a distributed and efficient oracle solution tailored for the IoT, enabling fast acquisition of real-time off-chain data. Specifically, we first design a distributed oracle architecture that combines both Trusted Execution Environment (TEE) devices and ordinary devices to improve system scalability, considering the heterogeneity of IoT devices. Second, based on the trusted node information provided by the TEE, we determine the matching relationship between nodes and data sources, assigning appropriate nodes to tasks to enhance system efficiency. Through simulation experiments, our proposed solution is shown to effectively improve the efficiency and service quality of the system, reducing the average response time by approximately 9.92% compared with conventional approaches.
Displacement current is the last piece of the puzzle of electromagnetic theory. Its existence implies that electromagnetic disturbances can propagate at the speed of light, and it ultimately led to the discovery of Hertzian waves. On the other hand, since magnetic fields can be calculated from conduction currents alone using the Biot-Savart law, a popular belief has circulated that displacement current does not produce magnetic fields. But some wonder: if this is correct, what was the displacement current introduced for? The controversy over the meaning of displacement current has been going on for more than a hundred years. Such confusion arises from forgetting that, in the case of non-stationary currents, neither the magnetic field created by conduction currents nor that created by displacement currents can be defined separately. It is also forgotten that the effect of the displacement current is automatically incorporated in the magnetic field calculated by the Biot-Savart law. In this paper, mainly with the help of the Helmholtz decomposition, we clarify the confusion surrounding displacement currents and provide an opportunity to end the long-standing controversy.
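The relations at issue in the abstract above can be stated compactly. A standard formulation in SI units (the quasi-static reading of the Biot-Savart law is the one the "conduction currents only" belief appeals to):

```latex
% Ampere--Maxwell law: the curl of B is sourced by both the conduction
% current J and the displacement current term.
\nabla \times \mathbf{B}
  = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}

% Biot--Savart law, written with the conduction current alone:
\mathbf{B}(\mathbf{r},t)
  = \frac{\mu_0}{4\pi} \int
    \frac{\mathbf{J}(\mathbf{r}',t) \times (\mathbf{r}-\mathbf{r}')}
         {|\mathbf{r}-\mathbf{r}'|^{3}} \, dV'

% Helmholtz decomposition of the current density into longitudinal
% (irrotational) and transverse (solenoidal) parts:
\mathbf{J} = \mathbf{J}_{\parallel} + \mathbf{J}_{\perp},
\qquad \nabla \times \mathbf{J}_{\parallel} = \mathbf{0},
\qquad \nabla \cdot \mathbf{J}_{\perp} = 0
```

The decomposition is the key to the argument: attributing separate magnetic fields to the conduction and displacement terms individually is what the abstract identifies as the source of the century-old confusion.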
Yuhei YAMAMOTO Naoki SHIBATA Tokiyoshi MATSUDA Hidenori KAWANISHI Mutsumi KIMURA
The thermoelectric effect of Ga-Sn-O (GTO) thin films has been investigated for Internet-of-Things applications. It is found that amorphous GTO thin films provide higher power factors (PF) than polycrystalline ones, because grain boundaries block electron conduction in the polycrystalline films. It is also found that GTO thin films annealed in vacuum provide higher PF than those annealed in air, because oxygen vacancies are terminated by annealing in air. The PF and dimensionless figure of merit (ZT) are not outstanding, but the cost effectiveness is excellent, which is most important for some Internet-of-Things applications.
Arif DATAESATU Kosuke SANADA Hiroyuki HATANO Kazuo MORI Pisit BOONSRIMUANG
The fifth-generation (5G) new radio (NR) standard employs ultra-reliable and low-latency communication (URLLC) to provide real-time wireless interactive capability for Internet of Things (IoT) applications. To satisfy the stringent latency and reliability demands of URLLC services, grant-free (GF) transmissions with K-repetition transmission (K-Rep) have been introduced. However, fading fluctuations can degrade signal quality at the base station (BS), increasing the number of repetitions and raising concerns about interference and energy consumption for IoT user equipment (UE). To overcome these challenges, this paper proposes novel adaptive K-Rep control schemes that employ site diversity reception to enhance signal quality and reduce energy consumption. The performance evaluation demonstrates that the proposed adaptive K-Rep control schemes significantly improve communication reliability and reduce transmission energy consumption compared with the conventional K-Rep scheme, thereby satisfying the URLLC requirements while reducing energy consumption.
Yanming CHEN Bin LYU Zhen YANG Fei LI
In this paper, we investigate a wireless-powered relay-assisted batteryless IoT network based on a non-linear energy harvesting model, in which an energy service provider is constituted by the hybrid access point (HAP) and an IoT service provider by multiple clusters. The HAP provides energy signals to the batteryless devices for information backscattering and to the wireless-powered relays for energy harvesting. The relays are deployed to assist the batteryless devices with information transmission to the HAP using the harvested energy. To model the energy interactions between the energy service provider and the IoT service provider, we propose a Stackelberg game based framework and aim to maximize the respective utility values of the two providers. Since the utility maximization problem of the IoT service provider is non-convex, we employ fractional programming theory and propose a block coordinate descent (BCD) based algorithm with successive convex approximation (SCA) and semi-definite relaxation (SDR) techniques to solve it. Numerical simulation results confirm that, compared with the benchmark schemes, our proposed scheme achieves larger utility values for both the energy service provider and the IoT service provider.
Non-Terrestrial Networks (NTN) can provide seamless and ubiquitous connectivity for massive numbers of devices. Thus, the feeder links between satellites and gateways need to provide extremely high data transmission rates. In this paper, we focus on a typical high-capacity scenario, LEO-IoT, and seek an optimal satellite selection scheme to maximize the capacity of the feeder links. The proposed scheme obtains the optimal mapping between all satellites and gateways. Compared with the maximum-service-time algorithm, the proposed scheme constructs a more balanced and reasonable connection pattern, improving the efficiency of the gateways. This advantage becomes more significant as the number of satellites increases.
Akio KAWABATA Takuya TOJO Bijoy CHAND CHATTERJEE Eiji OKI
Mission-critical monitoring services, such as finding criminals with monitoring cameras, require rapid detection of newly updated data, so suppressing delay is desirable. In this direction, this paper proposes a network design scheme that minimizes this delay for monitoring services consisting of Internet-of-Things (IoT) devices located at terminal endpoints (TEs), databases (DBs), and applications (APLs). The proposed scheme determines the allocation of the DB and APLs and the selection of the server to which each TE belongs; the DB and APLs are allocated to optimal servers among the multiple servers in the network. We formulate the proposed network design scheme as an integer linear programming problem. The delay reduction effect of the proposed scheme is evaluated on two network topologies and a monitoring camera system network. On the two network topologies, the delays of the proposed scheme are 78 and 80 percent of that of the conventional scheme; on the monitoring camera system network, the delay of the proposed scheme is 77 percent of that of the conventional scheme. These results indicate that the proposed scheme reduces the delay compared with the conventional scheme, in which APLs are located near TEs. The computation time of the proposed scheme is acceptable for the design phase before the service is launched. The proposed scheme can contribute to a network design that quickly detects newly added objects in monitoring services.
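For intuition about the design problem, here is a toy brute-force version of server placement; the path model (each TE reaches the APL, which queries the DB) and the min-max objective are illustrative assumptions, not the paper's exact ILP formulation.

```python
from itertools import product

def min_delay_placement(d, te_list, servers):
    """Toy placement search: choose one server for the DB and one for the
    APL so that the worst-case TE -> APL -> DB delay is minimized.
    d[u][v] gives the delay between nodes u and v.
    Returns (db_server, apl_server, worst_case_delay)."""
    best = (None, None, float("inf"))
    for db, apl in product(servers, repeat=2):
        worst = max(d[te][apl] + d[apl][db] for te in te_list)
        if worst < best[2]:
            best = (db, apl, worst)
    return best
```

An ILP scales this idea to many DBs, APLs, and TE-to-server assignments simultaneously, which is why the paper resorts to integer linear programming rather than enumeration.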
Yuki ABE Kazutoshi KOBAYASHI Jun SHIOMI Hiroyuki OCHI
Energy harvesting has been widely investigated as a potential solution for supplying power to Internet of Things (IoT) devices. Computing devices must operate intermittently rather than continuously, because harvested energy is unstable and some IoT applications are periodic. Therefore, processors for IoT devices with intermittent operation must feature a zero-standby-power hibernation mode in addition to an energy-efficient normal mode. In this paper, we describe the layout design and measurement results of a nonvolatile standard cell memory (NV-SCM) and nonvolatile flip-flops (NV-FF) using a Fishbone-in-Cage Capacitor (FiCC) nonvolatile memory, suitable for IoT processors with intermittent operation. They can be fabricated in any conventional CMOS process without additional masks. The NV-SCM and NV-FF were fabricated in a 180 nm CMOS process. The area overheads due to nonvolatility of a bit cell are 74% for the NV-SCM and 29% for the NV-FF. We confirmed full functionality of the NV-SCM and NV-FF. Simulation shows that a nonvolatile system using the proposed NV-SCM and NV-FF can reduce energy consumption by 24.3% compared with a volatile system when the hibernation/normal operation time ratio is 500.
Koji NAKAO Katsunari YOSHIOKA Takayuki SASAKI Rui TANABE Xuping HUANG Takeshi TAKAHASHI Akira FUJITA Jun'ichi TAKEUCHI Noboru MURATA Junji SHIKATA Kazuki IWAMOTO Kazuki TAKADA Yuki ISHIDA Masaru TAKEUCHI Naoto YANAI
In this paper, we developed the latest IoT honeypots to capture IoT malware currently on the loose, analyzed IoT malware with new features such as persistent infection, and developed malware removal methods to be provided to IoT device users. Furthermore, as attack behaviors using IoT devices become more diverse and sophisticated every year, we conducted research on the various factors involved in understanding the overall picture of attack behavior from the perspective of incident responders. As the final stage of countermeasures, we also researched and developed IoT malware disabling technology, which stops only the IoT malware activities within IoT devices, and IoT system disabling technology, which remotely controls (including stops) the IoT devices themselves.
Veeramani KARTHIKA Suresh JAGANATHAN
Considering the growth of IoT networks, there is a demand for decentralized solutions. Incorporating blockchain technology eliminates the challenges faced by centralized solutions, such as i) high infrastructure cost, ii) maintenance cost, iii) lack of transparency, iv) privacy concerns, and v) data tampering. A blockchain-based IoT network allows businesses to access and share IoT data within their organization without a central authority. Data in a blockchain are stored as blocks, which must be validated and added to the chain; here the consensus mechanism plays a significant role. However, existing methods are not designed for IoT applications and fall short in i) decentralization, ii) scalability, iii) throughput, iv) convergence speed, and v) network overhead. Moreover, current blockchain frameworks fail to support resource-constrained IoT applications. In this paper, we propose a new consensus method (WoG) and a lightweight blockchain framework (iLEDGER), mainly for resource-constrained IoT applications in a permissioned environment. The proposed work is tested in an application that tracks assets using IoT devices (Raspberry Pi 4 and RFID). Furthermore, the proposed consensus method is analyzed against benign failures, and performance parameters such as CPU usage, memory usage, throughput, transaction execution time, and block generation time are compared with state-of-the-art methods.
Kosuke MURAKAMI Takahiro KASAMA Daisuke INOUE
Since the outbreak of the IoT malware "Mirai," several incidents have occurred in which IoT devices have been infected with malware. The malware targets IoT devices whose Telnet and SSH services are accessible from the Internet and whose ID/password settings are not strong enough. Several IoT malware families, including Mirai, are also known to restrict access to Telnet and other services after infection to keep the devices from being infected by other malware. However, according to network scan results, Telnet services on tens of thousands of devices in Japan are still accessible over the Internet. Does this imply that these devices avoid malware infection by setting strong enough passwords, and thus cannot be used as stepping stones for cyber attacks? In February 2019, we initiated the National Operation Toward IoT Clean Environment (NOTICE) project in Japan to investigate IoT devices with weak credentials and notify the device users. In this study, we analyze the results of the NOTICE project from February 2021 to May 2021 together with the results of large-scale darknet monitoring to reveal whether IoT devices with weak credentials are infected with malware. Moreover, we analyze the IoT devices with weak credentials to find the factors that prevent these devices from being infected with malware and to assess the risk of abuse for cyber attacks. The analysis reveals that approximately 2,000 devices in Japan can easily be logged into using weak credentials in one month. We also clarify that, with the exception of a single host, none of the devices were infected with Mirai or its variants, owing to a lack of the functions used for malware infection. Finally, even though the devices logged into by the NOTICE project are not infected with Mirai, we find that at least 80% of them can execute arbitrary scripts and 93% can send packets to arbitrary destinations.
With the support of emerging technologies such as 5G, machine learning, edge computing, and Industry 4.0, the Internet of Things (IoT) continues to evolve and promote the construction of future networks. Existing work on IoT mainly focuses on practical applications, but there is little research on modeling the interactions among components in IoT systems and verifying the correctness of network deployments. Therefore, the Calculus of the Internet of Things (CaIT) was previously proposed to formally model and reason about IoT systems. In this paper, the CaIT calculus is extended by introducing broadcast communications. For modeling convenience, we provide explicit operations to model node mobility as well as the interactions between sensors (or actuators) and the environment. To support the use of UPPAAL for verifying the temporal properties of IoT networks described by the CaIT calculus, we establish a relationship between timed automata and the CaIT calculus. Using UPPAAL, we verify six temporal properties of a simple "smart home" example: Boiler On Manually, Boiler Off Automatically, Boiler On Automatically, Lights On, Lights Mutually, and Windows Simultaneously. The verification results show that the "smart home" works properly.
Ryouichi NISHIMURA Byeongpyo JEONG Hajime SUSUKITA Takashi TAKAHASHI Kenichi TAKIZAWA
The degree of reception of BS signals is affected by various factors. After routinely recording the signals at observation points at two locations, we found that momentary upward and downward level shifts occurred multiple times, mainly during the daytime. These level shifts were observed at one location, while no such signal was sensed at the other. After developing an algorithm to extract such momentary level shifts, we investigated their statistical properties. Careful analyses, including assessment of signal polarity, amplitude, duration, and time of day, together with comparison against actual flight schedules and route information, imply that these level shifts are attributable to interference between direct waves and waves reflected from aircraft flying at approximately tropopause altitude. This assumption is further validated through computer simulations of BS signal interference.