Weijie CHEN Ming CAI Xiaojun TAN Bo WEI
Accurate estimation of the state-of-charge is crucial for the battery, the most important power source in electric vehicles. To achieve better estimation results, an accurate battery model with optimal parameters is required. In this paper, a gradient-free optimization technique, namely the tree seed algorithm (TSA), is utilized to identify the parameters of the battery model. To strengthen the search ability of TSA and obtain higher-quality results, the original algorithm is improved in two ways. On one hand, the DE/rand/2/bin mechanism is employed to maintain colony diversity by generating mutant individuals at each time step. On the other hand, the control parameter of the algorithm is adaptively updated during the search to achieve a better balance between exploitation and exploration. The battery state-of-charge can be estimated simultaneously by treating it as one of the parameters. Experiments under different dynamic profiles show that the proposed method provides reliable and accurate estimation results. Conventional algorithms, such as the genetic algorithm and the extended Kalman filter, are also compared to demonstrate the superiority of the proposed method in terms of accuracy and robustness.
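Below is a minimal, illustrative Python sketch of the improved TSA update loop described in this abstract. The objective function, bounds, DE scaling factor, crossover rate, and the linear schedule for the search-tendency parameter ST are assumptions for demonstration, not the paper's exact settings.

```python
import numpy as np

def objective(x):
    # Stand-in fitness; the paper minimizes battery terminal-voltage error
    # over equivalent-circuit parameters (and SOC). Assumption only.
    return np.sum((x - 0.5) ** 2)

def improved_tsa(obj, dim, pop_size=30, iters=200, lb=0.0, ub=1.0, seed=0):
    rng = np.random.default_rng(seed)
    trees = rng.uniform(lb, ub, (pop_size, dim))
    fitness = np.array([obj(t) for t in trees])
    best = trees[fitness.argmin()].copy()
    for t in range(iters):
        st = 0.9 - 0.8 * t / iters        # adaptive control parameter (assumed schedule)
        F = 0.5                           # DE scaling factor (assumed)
        for i in range(pop_size):
            if rng.random() < st:
                # DE/rand/2/bin-style mutant to maintain colony diversity
                r = rng.choice(pop_size, 5, replace=False)
                mutant = (trees[r[0]] + F * (trees[r[1]] - trees[r[2]])
                          + F * (trees[r[3]] - trees[r[4]]))
                cross = rng.random(dim) < 0.9      # binomial crossover, CR assumed
                seed_vec = np.where(cross, mutant, trees[i])
            else:
                # Original TSA-style seed generation guided by the best tree
                r = rng.integers(pop_size)
                seed_vec = trees[i] + rng.uniform(-1, 1, dim) * (best - trees[r])
            seed_vec = np.clip(seed_vec, lb, ub)
            f = obj(seed_vec)
            if f < fitness[i]:
                trees[i], fitness[i] = seed_vec, f
        best = trees[fitness.argmin()].copy()
    return best, fitness.min()

print(improved_tsa(objective, dim=4))
```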
Kenji KANAI Bo WEI Zhengxue CHENG Masaru TAKEUCHI Jiro KATTO
This paper introduces recent trends in video streaming together with four methods proposed by the authors. Video traffic now dominates the Internet, and new visual content such as UHD and 360-degree movies is being delivered. MPEG-DASH has become popular for adaptive video streaming, and machine learning techniques are being introduced into several stages of the streaming pipeline. In line with these research trends, the authors present four methods: route navigation, throughput prediction, image quality assessment, and perceptual video streaming. These methods contribute to improving QoS/QoE performance and to reducing power consumption and storage size.
Wataru KAWAKAMI Kenji KANAI Bo WEI Jiro KATTO
We demonstrate that transportation modes can be recognized from communication quality factors, without any additional sensor devices. In the demonstration, instead of using global positioning system (GPS) and accelerometer sensors, we collect mobile TCP throughputs, received-signal strength indicators (RSSIs), and cellular base-station IDs (Cell IDs) through in-line network measurement while the user enjoys mobile services, such as video streaming. In the accuracy evaluation, we conduct two different field experiments to collect data for six typical transportation modes (static, walking, riding a bicycle, riding a bus, riding a train, and riding a subway), and then construct classifiers by applying a support-vector machine (SVM), k-nearest neighbors (k-NN), random forest (RF), and a convolutional neural network (CNN). Our results show that these transportation modes can be recognized with high accuracy from communication quality factors, comparable to the accuracy obtained with accelerometer sensors.
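As an illustration of the classification step, the following Python sketch trains an SVM on hypothetical communication-quality feature windows (mean and variance of TCP throughput, mean and variance of RSSI, Cell ID handover count). The feature layout and the randomly generated data are placeholders, not the paper's dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical feature matrix: each row is one measurement window with
# [mean throughput, throughput variance, mean RSSI, RSSI variance, handovers];
# labels 0..5 stand for the six transportation modes (static ... subway).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))            # placeholder features
y = rng.integers(0, 6, size=600)         # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```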
Fengyuan REN Chuang LIN Bo WEI
Available Bit Rate (ABR) flow control is an effective measure for ATM network congestion control. In large-scale, high-speed networks, the simplicity of the algorithm is crucial for optimizing switch performance. Although binary flow control is very simple, the queue length and allowed cell rate (ACR) controlled by the standard EFCI algorithm oscillate with great amplitude, which degrades performance; its applicability was therefore doubted, and explicit rate feedback mechanisms were introduced and explored instead. In this study, a model of binary flow control is built based on fluid flow theory, and its correctness is validated by simulation experiments. A linear model describing how the source end system regulates the cell rate is obtained through local linearization. We then evaluate and analyze the standard EFCI algorithm using the describing function approach, which is well developed in nonlinear control theory. The conclusion is that the queue and ACR oscillations are caused by an inappropriate nonlinear control rule derived from intuition, not by an intrinsic attribute of the binary flow control mechanism. Simulation experiments validate our analysis and conclusion. Finally, a new parameter-setting scheme is put forward to remedy the weakness of standard EFCI switches without any change to the hardware architecture. Numerical results demonstrate that the new scheme is effective.
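The following Python sketch simulates a fluid-flow style binary (EFCI) feedback loop to illustrate the queue/ACR oscillation discussed above. The thresholds, rate increase and decrease factors, link capacity, and feedback delay are assumed values, not the paper's settings or the exact standard parameterization.

```python
# Minimal fluid-flow style simulation of binary (EFCI) ABR flow control.
def simulate_efci(steps=2000, capacity=100.0, q_high=200.0,
                  rif=0.05, rdf=0.98, pcr=150.0, mcr=1.0, rtt=10):
    acr, queue = mcr, 0.0
    feedback = [0] * rtt                 # binary congestion indications in flight
    q_trace = []
    for _ in range(steps):
        # Switch marks EFCI when the queue exceeds the threshold.
        congested = 1 if queue > q_high else 0
        feedback.append(congested)
        # Source reacts to feedback delayed by one round-trip time.
        if feedback.pop(0):
            acr = max(mcr, acr * rdf)        # multiplicative decrease
        else:
            acr = min(pcr, acr + rif * pcr)  # additive increase
        # Fluid queue evolution: arrivals at ACR, service at link capacity.
        queue = max(0.0, queue + acr - capacity)
        q_trace.append(queue)
    return q_trace

trace = simulate_efci()
print("max queue:", max(trace), "min queue over last 500 steps:", min(trace[-500:]))
```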
Bo WEI Kenji KANAI Wataru KAWAKAMI Jiro KATTO
Throughput prediction is one of the promising techniques for improving the quality of service (QoS) and quality of experience (QoE) of mobile applications. To predict the future throughput distribution accurately over a whole session, which can exhibit large throughput fluctuations in different scenarios (especially when the user is moving), we propose a history-based throughput prediction method that utilizes time series analysis and machine learning techniques for mobile network communication. The method is called Hybrid Prediction with the Autoregressive Model and Hidden Markov Model (HOAH). Unlike existing methods, HOAH uses a Support Vector Machine (SVM) to classify the throughput transition into two classes and predicts the transmission control protocol (TCP) throughput by switching between the Autoregressive Model (AR Model) and the Gaussian Mixture Model-Hidden Markov Model (GMM-HMM). We conduct field experiments to evaluate the proposed method in seven different scenarios. The results show that HOAH can predict future throughput effectively and decreases the prediction error by up to 55.95% compared with other methods.
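The switching structure of HOAH can be sketched as below: an SVM decides from recent history which branch to use, the AR branch is a least-squares AR fit, and the GMM-HMM branch is replaced here by a simple stand-in. The training windows, labels, and throughput history are placeholders, not the paper's data or model.

```python
import numpy as np
from sklearn.svm import SVC

def ar_predict(history, order=3):
    """One-step-ahead prediction with an AR(order) model fit by least squares."""
    h = np.asarray(history, dtype=float)
    X = np.column_stack([h[i:len(h) - order + i] for i in range(order)])
    y = h[order:]
    coef, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(len(y))]), y, rcond=None)
    return float(np.dot(h[-order:], coef[:order]) + coef[order])

def hmm_like_predict(history):
    """Placeholder for the GMM-HMM branch: predicts the recent mean."""
    return float(np.mean(history[-10:]))

# Train the switching classifier on labeled throughput windows (assumed labels:
# 0 = smooth transition -> AR model, 1 = abrupt transition -> GMM-HMM branch).
rng = np.random.default_rng(0)
train_windows = rng.normal(20, 5, size=(200, 10))     # placeholder windows (Mbps)
train_labels = rng.integers(0, 2, size=200)           # placeholder labels
switch = SVC(kernel="rbf").fit(train_windows, train_labels)

history = list(20 + 5 * np.sin(np.arange(50) / 5) + rng.normal(0, 1, 50))
cls = switch.predict([history[-10:]])[0]
pred = ar_predict(history) if cls == 0 else hmm_like_predict(history)
print("predicted next-step throughput (Mbps):", round(pred, 2))
```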
Daxiu ZHANG Xianwei LI Bo WEI Yukun SHI
With the increase in the number of Mobile User Equipments (MUEs), numerous tasks with high resource requirements are generated. However, MUEs have limited computational resources, computing power, and storage space. In this paper, a joint coverage-constrained task offloading and resource allocation method based on deep reinforcement learning is proposed. The aim is to offload tasks that cannot be processed locally to edge servers, alleviating the conflict between the resource constraints of MUEs and the demands of high-performance task processing. The studied problem considers the dynamic variability and complexity of the system model, coverage, offloading decisions, communication relationships, and resource constraints. An entropy weight method is used to optimize the resource allocation process and balance energy consumption and execution time. The results show that the numbers of tasks and MUEs affect the execution time and energy consumption of the task offloading and resource allocation processes, and that the proposed method serves the interests of the service provider while enhancing the user experience.
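As a small illustration of the entropy weight step, the sketch below computes criterion weights for energy consumption and execution time from a hypothetical candidate-decision matrix; the values and the final scoring rule are assumptions for demonstration only.

```python
import numpy as np

def entropy_weights(M):
    """M: (n candidates) x (m criteria) cost matrix; returns criterion weights."""
    P = M / M.sum(axis=0, keepdims=True)        # normalize each criterion column
    n = M.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    e = -np.sum(P * logP, axis=0) / np.log(n)   # Shannon entropy per criterion
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()

# Columns: [energy consumption (J), execution time (s)] for candidate decisions.
candidates = np.array([[2.1, 0.30],
                       [1.4, 0.55],
                       [3.0, 0.20]])
w = entropy_weights(candidates)
norm = candidates / candidates.max(axis=0)      # scale both cost criteria to [0, 1]
scores = norm @ w                               # lower weighted cost is better
print("weights:", w, "best candidate:", int(scores.argmin()))
```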