  • Publication of the China Computer Federation
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Current Issue

    • 论文
      An improved simulated annealing algorithm for
      program optimization parameters search   
      LU Pingjing,LI Bao,YI RenJiao,ZHANG Ying,Wang Shaogang,PANG Zhengbin
      2015, 37(07): 1227-1232. doi:
      Abstract ( 390 )   PDF (751KB) ( 98389 )     

      High-level program transformations are critical to improving application performance, and many of them mainly concern the determination of optimal transformation parameters, such as loop blocking sizes. Since the optimization parameter search problem is NP-hard, no deterministic algorithm is known that solves it efficiently. Treating it as a nonlinear global optimization problem, we introduce an improved simulated annealing algorithm to find the optimal parameters. Several comparative experiments demonstrate the performance and effectiveness of the new method.
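      The standard simulated annealing scheme the paper builds on can be sketched as follows (a minimal generic sketch, not the authors' improved variant; the loop-blocking objective below is a hypothetical stand-in for measured run time):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=200, seed=0):
    """Generic simulated annealing: minimize cost() starting from x0."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fy - fx) / t).
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Hypothetical objective: "run time" minimized at a loop-blocking factor of 32.
cost = lambda b: (b - 32) ** 2
neighbor = lambda b, rng: max(1, b + rng.choice([-4, -2, -1, 1, 2, 4]))
best, fbest = simulated_annealing(cost, neighbor, x0=1)
```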

      An improvement of random forests algorithm based on
      comprehensive sampling without replacement  
      LI Hui,LI Zheng,SHE Kun
      2015, 37(07): 1233-1238. doi:
      Abstract ( 179 )   PDF (574KB) ( 394 )     

      Data mining is an important method in big data and service computing. As a typical data mining method, random forest is widely used due to its low error rate. In order to deal with big data more accurately and efficiently, we further improve the accuracy and efficiency of random forests. We demonstrate both theoretically and practically that our method can decrease the generalization error by about 12%~20% when the number of samples drawn exceeds the number of original samples. Moreover, we replace repeated sampling (sampling with replacement) with a simpler sampling method, which we prove equivalent to repeated sampling. In this way we can decrease the time of building the forest, improving efficiency by about 10%~40% when the method is used alone, which just makes up for the efficiency loss of the first improvement. Combining the two methods, we improve efficiency on unbalanced data by 10%, and improve accuracy on balanced data by over 12% without any impact on efficiency. Therefore, the proposed method is more suitable than the original one for big data analysis and processing in service computing.
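      The sampling change can be illustrated in a few lines (a sketch; the 0.632 subsample fraction is an assumption chosen to match the expected unique fraction of a bootstrap, not a figure from the paper):

```python
import random

def bootstrap_sample(data, rng):
    # Classical bagging: n draws WITH replacement (~63.2% unique samples).
    n = len(data)
    return [data[rng.randrange(n)] for _ in range(n)]

def subsample_without_replacement(data, frac, rng):
    # The simpler one-pass scheme: no repeats, cheaper to draw.
    return rng.sample(data, int(len(data) * frac))

rng = random.Random(42)
data = list(range(1000))
boot = bootstrap_sample(data, rng)
sub = subsample_without_replacement(data, 0.632, rng)
```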

      An iterative linear detection algorithm based on
      component-level soft interference cancellation
      LI Xiuli1,ZHANG Yinong1,ZHOU Jin1,DAI Xiaoming2
      2015, 37(07): 1239-1244. doi:
      Abstract ( 137 )   PDF (1452KB) ( 241 )     

      We propose an iterative detection and decoding (IDD) algorithm based on component-level soft cancellation (CLSC) and linear minimum mean-squared error (LMMSE) filtering for turbo coded multiple input multiple output (MIMO) multiplexing systems. By exploiting the independence between the in-phase and quadrature-phase branches, more accurate soft interference cancellation is realized. The extrinsic information transfer (EXIT) characteristics of the IDD system show that the proposal converges faster than the conventional one. Numerical results further validate that the CLSC-LMMSE algorithm achieves a noticeable bit error rate (BER) gain over symbol-level soft cancellation (SLSC) based LMMSE, while maintaining its low-complexity characteristic.

      A user dual clustering recommendation
      algorithm based on cloud model
      CHEN Pinghua,CHEN Chuanyu
      2015, 37(07): 1245-1251. doi:
      Abstract ( 104 )   PDF (526KB) ( 244 )     

      Collaborative filtering is a widely used recommendation algorithm, but problems such as low efficiency and data sparseness still exist. In order to solve these problems, we present an improved clustering recommendation algorithm. The algorithm introduces a cloud model, in which the expectation, entropy and hyper-entropy are calculated along the item attribute and user attribute dimensions. To build the user interest model, the influence of rating time, rating level and rating habits is also taken into account. Then the similarities of user interests are compared by a corrected similarity measurement based on the cloud model, and the K-means algorithm is adopted to perform clustering. Finally, when making predictions, the recommendation results for shared items are merged according to the proportion of participating users. Experimental results on the MovieLens dataset show that the algorithm can not only alleviate the problems of low efficiency and data sparseness but also improve the accuracy of the recommendation results.
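      The three cloud-model characteristics named above can be estimated from rating samples with the standard backward cloud generator (a sketch of the textbook estimator, not the paper's corrected similarity measure; the sample ratings are made up):

```python
import math

def backward_cloud(samples):
    """Backward cloud generator: estimate (Ex, En, He) of a normal cloud
    from a list of numeric samples (e.g. a user's ratings)."""
    n = len(samples)
    ex = sum(samples) / n                                   # expectation Ex
    # Entropy En from the first absolute central moment.
    en = math.sqrt(math.pi / 2) * sum(abs(x - ex) for x in samples) / n
    var = sum((x - ex) ** 2 for x in samples) / (n - 1)
    he = math.sqrt(abs(var - en ** 2))                      # hyper-entropy He
    return ex, en, he

ex, en, he = backward_cloud([4, 5, 3, 4, 4, 5, 3, 4])  # hypothetical ratings
```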

      All-digital phase-locked loop
      oriented time-to-digital converters  
      ZHANG Xiao,MA Zhuo,XIE Lunguo,YU Jinshan,YUAN Hengzhou,WANG Zhiqiang
      2015, 37(07): 1252-1257. doi:
      Abstract ( 187 )   PDF (2985KB) ( 235 )     

      The time-to-digital converter (TDC) is an important component of the all-digital phase-locked loop (ADPLL), in which it plays the role of the phase-frequency detector. This paper focuses on enhancing all-digital TDC resolution. We present the basic structures of three categories of all-digital TDCs: counter-based TDCs, gate-delay-line based TDCs and sub-gate-delay TDCs. We also discuss their respective advantages in terms of resolution, dynamic range, nonlinearity, etc. Finally, we summarize and suggest future research priorities.
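      The resolution difference between the categories comes down to the quantization step each architecture can resolve; a toy model (the 1 GHz clock and 20 ps gate delay are hypothetical example figures):

```python
def tdc_output(interval_ps, resolution_ps):
    """Model an ideal TDC as quantization of the measured interval at the
    architecture's resolution: one clock period for a counter-based TDC,
    one gate delay for a delay-line TDC."""
    return int(interval_ps // resolution_ps)

interval = 3275                       # ps, hypothetical start-stop interval
coarse = tdc_output(interval, 1000)   # counter at 1 GHz: 1000 ps resolution
fine = tdc_output(interval, 20)       # gate delay line: 20 ps per tap
```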

      A CPU-GPU collaboration based computing parallel algorithm
      for MTF degradation of remote sensing simulation images  
      ZHAO Ruibin1,ZHAO Shenghui1,HU Xinli2
      2015, 37(07): 1258-1264. doi:
      Abstract ( 171 )   PDF (3290KB) ( 263 )     

      In order to quantitatively simulate and analyze the impact of factors such as platform jitter, electronic properties, and atmospheric attenuation on the quality of a remote sensing system, it is necessary to compute the modulation transfer function (MTF) of the remote sensing system and apply it to simulation images. However, because of the large data volumes in remote sensing image simulation and the number of compute-intensive steps involved in MTF degradation, computational efficiency becomes the bottleneck. Thus, according to the existing MTF calculation model, we analyze the general process of MTF degradation of remote sensing simulated images and the complexity of the main steps of the algorithm. On this basis, we propose a CPU-GPU collaborative parallel computing algorithm for the MTF degradation of remote sensing simulation images. Experimental results show that the algorithm makes full use of the parallel computing capacity of GPUs and improves the computational efficiency of MTF degradation.
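      At its core, MTF degradation is per-frequency attenuation, which is what makes the step so GPU-friendly; a 1-D sketch with a naive DFT (illustrative only; real pipelines use 2-D FFTs over image tiles):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)).real / n
            for k in range(n)]

def mtf_degrade(signal, mtf):
    # MTF degradation = per-frequency attenuation: IDFT(DFT(signal) * MTF).
    # Every frequency bin is independent of the others, so the multiply
    # parallelizes trivially across GPU threads.
    return idft([s * m for s, m in zip(dft(signal), mtf)])

sig = [0.0, 1.0, 0.0, 1.0]
out = mtf_degrade(sig, [1.0, 1.0, 1.0, 1.0])   # identity MTF: unchanged
low = mtf_degrade(sig, [1.0, 0.0, 0.0, 0.0])   # keep DC only: flat mean
```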

      An improved access control strategy for perceptual layer of IoT  
      CAI Ting1,CHEN Changzhi1,OUYANG Kai2,ZHOU Jingli2
      2015, 37(07): 1265-1271. doi:
      Abstract ( 217 )   PDF (1968KB) ( 304 )     

      Due to the massive number of terminal nodes in the Internet of Things (IoT), traditional access control strategies face challenges in adaptability and efficiency. We therefore present an efficient access control strategy. Based on the ABAC mechanism, we give a formal definition of ABAC and design an access control framework for the perceptual layer of the IoT, which achieves more fine-grained authorization. Moreover, we use an improved CP-ABE algorithm in which a disjunctive normal form structure replaces the access tree structure of the traditional CP-ABE algorithm, which effectively simplifies the computation and improves system efficiency. Evaluation experiments demonstrate that the proposed access control strategy reduces the time cost of the system and has better adaptability than traditional strategies in the IoT environment.
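      Why a DNF policy is cheap to check can be seen directly (a sketch; the attribute names are hypothetical, and real CP-ABE enforces the clause cryptographically rather than over plaintext attribute sets):

```python
def dnf_satisfies(user_attrs, policy_dnf):
    """A DNF access policy is a list of clauses; each clause is a set of
    attributes that must ALL be held. Access is granted if ANY clause is
    satisfied -- a single pass over the clauses, no tree traversal."""
    attrs = set(user_attrs)
    return any(clause <= attrs for clause in policy_dnf)

# Hypothetical perception-layer policy: (nurse AND ward3) OR doctor.
policy = [{"nurse", "ward3"}, {"doctor"}]
```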

      An adaptive distributed private key
      generator scheme in Ad Hoc networks 
      REN Fang
      2015, 37(07): 1272-1279. doi:
      Abstract ( 120 )   PDF (2923KB) ( 270 )     

      In order to improve the security of identity-based cryptography in Ad Hoc networks, we propose an adaptive distributed private key generator (DPKG) scheme, in which the server nodes that provide the DPKG service are not fixed but are dynamically chosen by all the nodes during the operation of the network. Each node in the Ad Hoc network is assigned a credit value which changes with the valid accusation messages sent by nodes in the network, and the n nodes with the highest credit values automatically become servers. The scheme avoids the security risks caused by static server nodes in traditional DPKG and greatly decreases the leakage probability of the PKG's secret key, so the ability to resist attackers outside the Ad Hoc network is improved effectively.
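      The election rule is just a top-n selection over credit values (a sketch; the penalty amount and the id-based tie-breaking are assumptions the abstract does not specify):

```python
def elect_servers(credits, n):
    # credits: {node_id: credit}. The n highest-credit nodes become the
    # DPKG servers; ties broken by node id for determinism (assumption).
    return sorted(credits, key=lambda v: (-credits[v], v))[:n]

def accuse(credits, accused, penalty=1):
    # A valid accusation lowers the accused node's credit, which can
    # push it out of the server set at the next election.
    credits[accused] -= penalty

credits = {"a": 5, "b": 4, "c": 3, "d": 2}  # hypothetical network state
first = elect_servers(credits, 2)
accuse(credits, "b", penalty=3)             # b: 4 -> 1
second = elect_servers(credits, 2)
```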

      Safety of self-certified signcryption scheme 
      WANG Yun1,LU Dianjun2
      2015, 37(07): 1280-1283. doi:
      Abstract ( 140 )   PDF (399KB) ( 184 )     

      A signcryption scheme integrates digital signature and encryption, differing from the traditional sign-then-encrypt approach in that it achieves both confidentiality and authentication for the transferred information in a single logical step. Thus how to design a safe and efficient signcryption scheme is particularly important. We analyze security flaws in an existing scheme in the literature which does not meet the unforgeability requirement; that is, there is a known plaintext-ciphertext forgery attack against it. Any third party can steal the plaintext and ciphertext and forge the signcrypter's legitimate signature on any message, which seriously harms the interests of signcrypters.

      A novel self-growing network model and its algorithm 
      ZHANG Zhichang1,YAO Dongren1,LIU Xia2
      2015, 37(07): 1284-1289. doi:
      Abstract ( 145 )   PDF (588KB) ( 267 )     

      As is well known, most real-world networks are not random networks. A few nodes usually have many links while most nodes have few; this is an important characteristic of scale-free networks and an important research topic. In this paper we first define a new type of self-growing network model, calculate its basic parameters, and validate its scale-free property. We then prove that the degree distribution of the spanning tree with maximum leaves obeys a power-law distribution, and we obtain the balanced set of the network. Through these results we make a preliminary exploration of scale-free networks. Finally we propose an algorithm to calculate the average path length.
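      The few-hubs/many-leaves pattern described above emerges from any degree-proportional growth rule; a sketch of the standard Barabási-Albert process (the classic scale-free generator, not the paper's self-growing model):

```python
import random
from collections import Counter

def preferential_attachment(n, seed=0):
    """Grow a network where each new node links to one existing node
    chosen with probability proportional to its degree."""
    rng = random.Random(seed)
    targets = [0, 1]        # degree-weighted urn: node i appears deg(i) times
    edges = [(0, 1)]
    for new in range(2, n):
        t = rng.choice(targets)          # degree-proportional choice
        edges.append((new, t))
        targets += [new, t]
    return edges

edges = preferential_attachment(200)
deg = Counter()
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
```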

      Study on the performance of beaconenabled ZigBee
      networks based on OPNET simulation   
      LAN Yong1,CHEN Zhicheng2
      2015, 37(07): 1290-1296. doi:
      Abstract ( 128 )   PDF (4619KB) ( 217 )     

      Aiming to understand how superframe parameters impact the performance of ZigBee networks, we study this relationship based on the OPNET Modeler and conduct simulations on network models with different superframe parameters. On the basis of the simulation results, we identify the interrelation between network performance and superframe parameters. Furthermore, based on the identified interrelation, we propose a superframe parameter configuration strategy for ZigBee networks. Experimental results show that the OPNET simulation method depicts the impact of superframe parameters on ZigBee network performance more comprehensively, and the proposed configuration strategy provides a better configuration of superframe parameters for ZigBee network applications.

      An approach for ontology cohesion
      metrics based on directed acyclic graph 
      LIAO Lili,SHEN Guohua,HUANG Zhiqiu,KAN Shuanglong
      2015, 37(07): 1297-1303. doi:
      Abstract ( 114 )   PDF (564KB) ( 155 )     

      With the widespread use and rapid development of ontologies, their structures and semantics are becoming more and more complex, and evaluating ontology quality has become a major problem for ontology construction and reuse. In the construction process, ontology evaluation helps in restructuring and optimizing ontologies toward high quality. In the reuse process, ontology evaluation can help users select the optimally structured ontology among the candidates. In this paper we first propose a set of evaluation metrics to measure the cohesion of ontologies based on directed acyclic graphs (DAGs). Then we validate the theoretical effectiveness of the proposed metrics under frameworks for software measurement validation. Finally, we conduct experiments on a set of classical ontologies, and the results demonstrate that the proposed metrics are reasonable and effective, which is conducive to ontology construction and reuse.
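      One simple cohesion-style ratio over a DAG of concepts (an illustration of the general idea only, not the paper's actual metrics): the fraction of ordered concept pairs connected by a directed path.

```python
from collections import defaultdict

def cohesion(n_concepts, relations):
    """Fraction of ordered concept pairs (a, b), a != b, with a directed
    path from a to b in the DAG of relations."""
    adj = defaultdict(set)
    for a, b in relations:
        adj[a].add(b)

    def reachable(start):
        # Iterative DFS collecting every concept reachable from `start`.
        seen, stack = set(), [start]
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen

    pairs = sum(len(reachable(c)) for c in range(n_concepts))
    return pairs / (n_concepts * (n_concepts - 1))

# Chain 0 -> 1 -> 2: reachable pairs are (0,1), (0,2), (1,2) = 3 of 6.
score = cohesion(3, [(0, 1), (1, 2)])
```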

      AFMC algorithm for SVM parameter optimization  
      GAO Leifu,ZHAO Shijie,YU Dongmei,TU Jun
      2015, 37(07): 1304-1310. doi:
      Abstract ( 115 )   PDF (1315KB) ( 232 )     

      Support vector machine (SVM) parameter optimization is an important research direction, but there is still no systematic theory to guide the selection of SVM parameters. Since optimizing SVM parameters with the artificial fish-swarm algorithm tends to stall in a small neighborhood of the approximate optimal solution, we design the AFMC algorithm for SVM parameter optimization. In the early stage, we use the strong parallel optimization performance of the fish-swarm algorithm to quickly obtain an approximate optimal solution. Then we use a Monte Carlo algorithm for local search to achieve a quick and effective refinement toward the optimal solution. Numerical experiments show that the proposed algorithm has better classification performance and faster search speed, and is effective and feasible for SVM parameter optimization.
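      The second, Monte Carlo stage can be sketched as random sampling in a shrinking neighborhood of the swarm stage's result (a sketch; the 1-D quadratic objective is a hypothetical stand-in for SVM cross-validation error):

```python
import random

def monte_carlo_refine(cost, x0, radius=1.0, shrink=0.5, rounds=5, draws=50, seed=0):
    """Refine an approximate optimum x0 by uniform random sampling in a
    neighborhood that tightens each round."""
    rng = random.Random(seed)
    best, fbest = x0, cost(x0)
    r = radius
    for _ in range(rounds):
        for _ in range(draws):
            x = best + rng.uniform(-r, r)
            fx = cost(x)
            if fx < fbest:
                best, fbest = x, fx
        r *= shrink  # shrink the search neighborhood
    return best, fbest

# Toy stand-in for cross-validation error, optimum at C = 2.0;
# x0 = 1.7 plays the role of the fish-swarm stage's approximate solution.
best, fbest = monte_carlo_refine(lambda c: (c - 2.0) ** 2, x0=1.7)
```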

      Elitist learning strategy: an improved particle swarm
      optimizer algorithm for stack selection optimization 
      ZHANG Qiqi1,2,ZHANG Tao1,3,LIU Peng1
      2015, 37(07): 1311-1317. doi:
      Abstract ( 132 )   PDF (590KB) ( 176 )     

      To solve the stack selection problem in the integrated management of inventory and production for iron and steel enterprises, we construct a joint optimization model that balances the load of each stack while maximizing the comprehensive slab matching degree, subject to A-shaped constraints, dispersion constraints, etc. To help the solution jump out of local optima during evolution with the particle swarm optimization (PSO) algorithm, we introduce an elitist learning strategy, which can improve solutions after the group converges. Finally, simulation results demonstrate the validity and feasibility of the proposed algorithm.
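      The elitist learning step itself is small (a sketch: perturb the global best and keep the change only if it improves; the Gaussian kick size is an assumption, and the toy objective stands in for the stack selection model):

```python
import random

def elitist_learning(gbest, cost, sigma=0.1, seed=0):
    # Kick the converged global best with Gaussian noise; accept only
    # improvements, so the swarm can escape a local optimum without
    # ever losing its best-known solution.
    rng = random.Random(seed)
    candidate = [g + rng.gauss(0, sigma) for g in gbest]
    return candidate if cost(candidate) < cost(gbest) else gbest

cost = lambda x: sum(v * v for v in x)   # toy objective, optimum at origin
gbest = [0.5, -0.5]
improved = elitist_learning(gbest, cost)
```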

      Research on collaborative decision of multistage inventory
      in chain retail supply chain 
      XUE Hong1,XUE Jun2,QIU Bin1
      2015, 37(07): 1318-1324. doi:
      Abstract ( 109 )   PDF (1242KB) ( 295 )     

      Aiming at the dynamic optimization problem of multi-stage inventory resource allocation in chain retail supply chains, we put forward a bi-level programming model of multi-stage inventory, in which the inventory strategy is optimized at the upper level and the logistics distribution schemes at the lower level. Using the local genetic operations of fine-grained genetic algorithms, we simulate the locality of microscopic group interactions, and study collaborative decision-making for multi-stage inventory in chain retail supply chains based on agent group behavior optimization with fine-grained genetic algorithms and on the emergence mechanism of complex adaptive systems. The validity of the model is verified by an example experiment. Simulation results show that, from a systems engineering view, dynamic optimization of multi-stage inventory resource allocation and information sharing in the chain retail supply chain is realized through the group behavior optimization of microscopic individual agents, which can reduce the total cost of managing and operating multi-stage inventory.

      Software design and implementation of snow
      cover extraction based on FY-3A/VIRR data  
      ZHANG Yonghong1,2,CAO Ting2,REN Wei2 ,TIAN Wei3,WANG Jiangeng4,Yang Runzhi5,L
      2015, 37(07): 1325-1330. doi:
      Abstract ( 113 )   PDF (765KB) ( 228 )     

      So far little research has been carried out on processing FY-3A/VIRR data, which comes in huge volumes. Current commercial remote sensing image processing software cannot accomplish the preprocessing work directly, which causes great difficulties in subsequent image processing and in the dissemination of FY-3A/VIRR data. To solve this problem, we combine a modified normalized difference snow index (NDSI) and a comprehensive threshold method using mixed IDL and VB programming, and design a snow information batch extraction software package. We achieve snow information extraction and accuracy validation for single or multiple FY-3A/VIRR images. Experimental results show that the proposed software is fast and real-time, and can batch-extract snow information, thus saving substantial human effort and improving the dispatch and sharing of VIRR data. It can be extended to industrial production and automation in the future.
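      The classical NDSI at the heart of the pipeline is a two-band ratio (the paper uses a modified index plus comprehensive thresholds; the 0.4 snow threshold below is the conventional value, not necessarily the paper's, and the reflectances are made up):

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index from green and shortwave-infrared
    reflectance: snow is bright in green and dark in SWIR."""
    return (green - swir) / (green + swir)

def is_snow(green, swir, threshold=0.4):
    # 0.4 is the conventional NDSI snow cutoff (an assumption here).
    return ndsi(green, swir) > threshold

snow_pixel = is_snow(0.8, 0.1)      # bright green, dark SWIR
bare_pixel = is_snow(0.3, 0.25)     # similar reflectance in both bands
```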

      High dynamic range image fusion
      based on camera response function 
      DU Lin,SUN Huayan,ZHANG Tinghua,WANG Shuai
      2015, 37(07): 1331-1337. doi:
      Abstract ( 180 )   PDF (1062KB) ( 398 )     

      Highdynamicrange imaging technique can provide a very efficient way of reconstructing the image of objects and the surrounding environment, so it is currently widely applied in the domain of military,aerospace science etc.and has shown significant value in academia.In this paper, a series of images of the same scenario under different levels of exposure are taken by Cannon 1DC.The camera response curve of the colorful images in different channels is depicted through careful computational deduction.The image with a high dynamic range is then obtained through the establishment of the radiance map between the gray value and the irradiance. Thirdly, the bilateralfilterbased tone mapping algorithm is applied again to conduct the compression of the highdynamicrange images. Finally, the color calibration is implemented by the specularreflectionbased white balance method. In the experiment, images with different frames are selected to demonstrate the efficiency of the proposed algorithm. Experimental results show that at least 4 frames are required to realize highdynamicrange image fusion.

      Improved SVM multiple classifiers
      for image annotation 
      WU Wei1,NIE Jianyun2,GAO Guanglai1
      2015, 37(07): 1338-1343. doi:
      Abstract ( 96 )   PDF (1396KB) ( 236 )     

      We propose a novel classifier for the multi-label image annotation task based on an improved SVM. We first define a histogram intersection distance as the SVM kernel function. Then, the original SVM output is transformed into the distance between a given sample and the hyperplane. Additionally, a feature selection method is developed for our model: we choose visual features with small mutual correlations to establish the SVM-based classifier. Furthermore, on account of the uneven distribution of image categories, we also introduce a probability weighting strategy into our SVM model. Experiments on the ImageCLEF dataset not only confirm the effectiveness of the proposed model, but also show that the proposed feature selection method suits the classifier well. Compared with traditional classifiers, our method obtains the best results, and is competitive with state-of-the-art methods.
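      The histogram intersection kernel itself is one line (the toy histograms are made up):

```python
def hist_intersection_kernel(h1, h2):
    """Histogram intersection kernel K(x, y) = sum_i min(x_i, y_i),
    the kernel substituted for the usual linear/RBF choices."""
    return sum(min(a, b) for a, b in zip(h1, h2))

h1 = [0.2, 0.5, 0.3]   # normalized visual-word histograms (hypothetical)
h2 = [0.4, 0.4, 0.2]
k = hist_intersection_kernel(h1, h2)
```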

      Classification and feature extraction of hyperspectral images
      based on improved minimum noise fraction transformation  
      BAI Lin,HUI Meng
      2015, 37(07): 1344-1348. doi:
      Abstract ( 147 )   PDF (611KB) ( 252 )     

      Based on minimum noise fraction transformation, we introduce a novel wavelet kernel method, which improves kernel minimum noise fraction transformation by replacing the traditional kernel function with a wavelet kernel function, since the multi-resolution analysis property of wavelets can improve the method's nonlinear mapping capability. Combining this novel kernel minimum noise fraction transformation with the relevance vector machine yields a new classification method for hyperspectral images. Simulation results show that the wavelet kernel minimum noise fraction transformation reflects the nonlinear characteristics of hyperspectral images. Applied to the HYDICE data (shot over the Washington DC Mall) and compared with the baseline algorithm, its classification accuracy is increased by 3%~8%, and the classification precision in areas with few samples is improved effectively.
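      A common choice of translation-invariant wavelet kernel is the Morlet kernel (the abstract does not name its mother wavelet, so this particular kernel is an assumption): K(x, y) = prod_i cos(1.75 d_i / a) * exp(-d_i^2 / (2 a^2)) with d_i = x_i - y_i.

```python
import math

def morlet_wavelet_kernel(x, y, a=1.0):
    """Translation-invariant Morlet wavelet kernel with dilation a."""
    k = 1.0
    for xi, yi in zip(x, y):
        d = xi - yi
        k *= math.cos(1.75 * d / a) * math.exp(-d * d / (2 * a * a))
    return k

k_self = morlet_wavelet_kernel([1.0, 2.0], [1.0, 2.0])  # K(x, x) = 1
```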

      A segmentation method for wheat leaf
      images with disease in complex background  
      ZHANG Wu,HUANG Shuai,WANG Jingjing,LIU Lianzhong
      2015, 37(07): 1349-1354. doi:
      Abstract ( 599 )   PDF (762KB) ( 189751 )     

      According to the properties of wheat disease images (wheat stripe rust and wheat leaf rust) in complex backgrounds, we propose a segmentation method that combines K-means clustering segmentation with the Otsu threshold algorithm. The proposed method comprises three main steps. First, exploiting the difference between the background and the leaves in the a* and b* channels, K-means clustering segmentation removes the irrelevant background. Second, we use the Otsu dynamic threshold segmentation method to perform binarization, and combine mathematical morphology with an area threshold method to separate the diseased wheat leaf from the complex background. Finally, K-means segmentation is applied again and the disease spots are eventually segmented out. Experimental results show that the correct extraction rate can reach 95%, and the method segments diseased regions from whole color images with good robustness and accuracy, thus offering an effective approach for wheat disease detection and diagnosis.
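      Step two's threshold choice, Otsu's method, picks the gray level that maximizes between-class variance over the histogram (a self-contained sketch on a toy bimodal histogram):

```python
def otsu_threshold(hist):
    """Otsu's method: return the gray level t that maximizes the
    between-class variance w0 * w1 * (mu0 - mu1)^2 when pixels are split
    into levels <= t (background) and > t (foreground)."""
    total = sum(hist)
    total_mean = sum(i * h for i, h in enumerate(hist)) / total
    best_t, best_var = 0, 0.0
    w0 = cum = 0.0
    for t, h in enumerate(hist):
        w0 += h / total          # background weight up to level t
        cum += t * h / total     # background intensity mass
        if w0 <= 0.0 or w0 >= 1.0:
            continue
        mu0 = cum / w0
        mu1 = (total_mean - cum) / (1 - w0)
        var = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Toy bimodal histogram: dark mass at levels 0-1, bright mass at 8-9.
t = otsu_threshold([10, 10, 0, 0, 0, 0, 0, 0, 10, 10])
```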