  • A journal of the China Computer Federation
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Current Issue

    • Papers
      Low power instruction cache design based on
      pipeline and sliding window structure  
      LI Wei,XIAO Jianqing
      2015, 37(06): 1037-1042. doi:

      While a cache significantly improves the performance of an embedded processor, the cache, especially the I-cache, also consumes a large proportion of the power budget. Reducing unnecessary accesses to the tag SRAM and the data SRAM can lower this power consumption. In this paper we design a pipelined I-cache access mechanism that avoids unnecessary accesses to the data SRAM. We also present a sliding window over the cache lines that records information about the current instruction cache line and predicts the next cache line, reducing unnecessary accesses to the tag SRAM. In the SMIC 90 nm process, the proposed method achieves a 50% power reduction of the I-cache without any performance degradation.
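
      The abstract does not give the window's exact bookkeeping, so the following is a minimal Python sketch of the underlying idea only: when a fetch stays inside the cache line already recorded by the window, the tag-SRAM lookup can be skipped. LINE_SIZE and the fetch trace are illustrative assumptions, not the paper's design.

      # Sketch: skip tag-SRAM lookups for fetches that stay inside the
      # cache line recorded by a one-entry window (hypothetical model).
      LINE_SIZE = 32  # bytes per I-cache line (assumed)

      def count_tag_lookups(fetch_addrs):
          lookups = 0
          window_line = None            # line address held by the window
          for pc in fetch_addrs:
              line = pc // LINE_SIZE
              if line == window_line:
                  continue              # window hit: tag SRAM not accessed
              lookups += 1              # window miss: normal tag comparison
              window_line = line        # record the new current line
          return lookups

      # A mostly sequential fetch stream skips most tag lookups.
      trace = list(range(0, 4096, 4))
      print(count_tag_lookups(trace), "tag lookups for", len(trace), "fetches")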

      An I/O scheduling algorithm of SSD based on heavy-tailed distributions
      WEI Yuanhao1,WU Xiaohua2,WU Qingbo1,SHAO Lisong1
      2015, 37(06): 1043-1046. doi:

      The sizes of the files stored on network servers follow a heavy-tailed distribution, access latency depends on the size of the accessed file, and the I/O operations of a solid state disk (SSD) are asymmetric. Therefore, based on the kernel's NOOP scheduler, we propose an I/O scheduling algorithm for SSDs that exploits the heavy-tailed distribution, improving SSD access performance by reducing the waiting time of the many small files. Experimental results show that the proposed algorithm reduces response time by 17% on average compared with the kernel's NOOP scheduler.
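
      The abstract does not spell out the dispatch policy, so here is a minimal Python sketch under the assumption that small requests are dispatched ahead of large ones, contrasted with NOOP's FIFO order. The workload and the size-proportional service model are toy illustrations.

      # Toy comparison of FIFO (NOOP-like) dispatch with a size-aware
      # shortest-request-first policy on a heavy-tailed workload.
      def avg_wait(request_sizes, shortest_first=False):
          order = sorted(request_sizes) if shortest_first else list(request_sizes)
          clock, total_wait = 0.0, 0.0
          for size in order:
              total_wait += clock       # time this request waited
              clock += size             # service time ~ request size (toy)
          return total_wait / len(request_sizes)

      requests = [4] * 90 + [4096] * 10  # many small files, a few huge ones
      print("FIFO          :", avg_wait(requests))
      print("shortest-first:", avg_wait(requests, shortest_first=True))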

      ACOCS: A hybrid resource scheduling
      algorithm in cloud environment  
      LI Huangda,CHENG Lianglun
      2015, 37(06): 1047-1052. doi:

      The ant colony algorithm is of great significance for solving combinatorial optimization problems, but the traditional ant colony scheduling algorithm suffers from slow search speed and a tendency to fall into local optima. In view of this, we combine the ant colony algorithm with the cuckoo search algorithm and propose a hybrid resource scheduling algorithm (ACOCS) for cloud environments. The method preserves the high accuracy and robustness of the ant colony algorithm while incorporating the rapid global search capability of the cuckoo search algorithm. Simulation results show that the proposed ACOCS scheduling strategy outperforms the ant colony algorithm: it reduces the response time required for effective scheduling and improves system resource utilization to some extent.
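
      The abstract does not specify how the two metaheuristics are coupled; the Python sketch below assumes one simple coupling, a Levy-flight mutation of the best-so-far assignment injected into an ant colony loop, on a toy task-to-VM scheduling problem. All parameters and the coupling rule are illustrative, not the paper's ACOCS update rules.

      import math, random

      N_TASKS, N_VMS = 20, 5
      speed = [random.uniform(1, 4) for _ in range(N_VMS)]    # VM speeds (toy)
      load = [random.uniform(1, 9) for _ in range(N_TASKS)]   # task sizes (toy)

      def makespan(assign):
          busy = [0.0] * N_VMS
          for t, vm in enumerate(assign):
              busy[vm] += load[t] / speed[vm]
          return max(busy)

      def levy_step(beta=1.5):
          # Mantegna's algorithm for a Levy-distributed step length
          num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
          den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
          sigma = (num / den) ** (1 / beta)
          return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

      tau = [[1.0] * N_VMS for _ in range(N_TASKS)]           # pheromone trails
      best = None
      for _ in range(200):
          # ant builds an assignment proportionally to pheromone
          ant = [random.choices(range(N_VMS), weights=tau[t])[0]
                 for t in range(N_TASKS)]
          # cuckoo move: occasional Levy-flight mutation of the best solution
          if best is not None and abs(levy_step()) > 1.0:
              ant = best[:]
              ant[random.randrange(N_TASKS)] = random.randrange(N_VMS)
          if best is None or makespan(ant) < makespan(best):
              best = ant
          for t, vm in enumerate(ant):                        # evaporate + deposit
              tau[t][vm] = 0.9 * tau[t][vm] + 1.0 / makespan(ant)
      print("best makespan:", round(makespan(best), 2))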

      Impact of well contact area on
      the PMOS SET pulse width   
      LIU Rongrong1,CHI Yaqing1,2,HE Yibai1,DOU Qiang1
      2015, 37(06): 1053-1057. doi:

      We analyze the impact of the well contact area on the PMOS SET pulse width at nanometer technology nodes using TCAD simulations. The results show that increasing the well contact area can broaden the SET pulse width due to the pulse quenching effect, which contradicts the traditional view that a larger well contact area effectively mitigates SET pulses. We also analyze how this phenomenon varies with the incident particle's LET value and the transistor spacing.

      Symbolic driver environment:
      a tool to aid in detecting Linux driver bugs
      FAN Wenliang,MAO Junjie,XIAO Qixue,XU Yongjian,YANG Weikang,CHEN Yu
      2015, 37(06): 1058-1063. doi:

      Linux driver bugs have been shown to be the major source of bugs in the whole system and can lead to serious security problems. We design a tool called the symbolic driver environment (SDE) to detect Linux driver bugs; it consists of the system model, the interactions between the driver and the kernel, and the interactions between the driver and the device. Using SDE, we test two real Linux drivers and find two bugs. The results show that the tool is practical, running 90% faster and achieving 20% larger coverage than an existing tool called SymDrive.

      An energy-aware routing algorithm
      based on ARIMA-ANN forecasting model  
      CAI Zhao,MA Linhua,SONG Bo,TANG Hong
      2015, 37(06): 1064-1070. doi:

      Aiming at the problem that traditional energy-aware OLSR protocols cannot simultaneously reduce transmission power consumption and balance the residual energy among nodes, we develop a new routing protocol, OLSR based on the residual energy ratio and transmission power consumption (OLSR_RC). A composite energy cost combining these two indicators is constructed and used as the routing metric. On one hand, the OLSR_RC protocol reduces the total power consumption of the entire network; on the other hand, it prevents the energy of low-energy nodes from being depleted rapidly. In addition, we adopt the hybrid ARIMA-ANN model to forecast the residual energy level of the nodes, which reduces the influence of topology control (TC) message loss on route selection. The new routing protocol has broad application prospects in wireless sensor networks.
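
      The abstract names the two ingredients of the composite cost but not the formula, so the Python sketch below assumes one plausible form, a weighted sum of transmission power and the reciprocal of the residual energy ratio; alpha and the numbers are illustrative only.

      # Hypothetical composite link cost: transmission power plus a term
      # that grows as the forwarding node's battery drains.
      def link_cost(tx_power, residual, capacity, alpha=0.5):
          energy_ratio = residual / capacity      # 1.0 = full battery
          return alpha * tx_power + (1 - alpha) / energy_ratio

      # The route metric is the sum of link costs along a path.
      path = [(1.0, 80, 100), (1.5, 20, 100)]     # (tx_power, residual, capacity)
      print(sum(link_cost(*hop) for hop in path))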

      Fusing correlation based multi-task
      compressive sensing for activity recognition  
      DUAN Mengqin,LI Renfa,HUANG Jing
      2015, 37(06): 1071-1078. doi:

      Sensor-based human activity recognition is an emerging research field. It is an important application of the Internet of Things (IoT) and has very promising prospects in health care and rehabilitation, assistance for the elderly and the disabled, smart homes and offices, and so on. Accuracy is one of the most important evaluation criteria for activity recognition, and appropriate features and classifiers are key factors behind it. We first extract a novel feature called the correlation feature. Then, combining the theories of compressive sensing and sparse representation, we propose a multi-task compressive sensing method and use it as the classifier for activity recognition. Finally, we conduct a large number of experiments on a set of benchmarks with Leave-One-Subject-Out cross validation. Experimental results show that the extracted feature and the proposed method effectively improve the accuracy of sensor-based activity recognition. Moreover, compared with the corresponding single-task method, the proposed classifier reduces execution time by nearly 56%.
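
      The abstract does not define the correlation feature precisely; the Python sketch below assumes one common reading, the pairwise Pearson correlations between the three accelerometer axes within a sliding window.

      import numpy as np

      def correlation_feature(window):
          # window: (n_samples, 3) array of x/y/z accelerometer readings
          c = np.corrcoef(window.T)           # 3x3 correlation matrix
          iu = np.triu_indices_from(c, k=1)   # upper triangle, no diagonal
          return c[iu]                        # [corr_xy, corr_xz, corr_yz]

      rng = np.random.default_rng(0)
      print(correlation_feature(rng.normal(size=(128, 3))))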

      Risk assessment selection based on static Bayesian game        
      YU Dingkun,WANG Jindong,ZHANG Hengwei,WANG Na,CHEN Yu
      2015, 37(06): 1079-1086. doi:

      Most existing studies of game-theory-based risk assessment use complete-information game models, which cannot handle situations in which the attacker and the defender do not know each other's actions. In this paper we establish an attack-defense model based on static Bayesian game theory, categorizing the attacker and the defender into different types. We then analyze the Bayesian equilibria of the game comprehensively, and improve the taxonomy and the cost quantification of the classical strategies by taking into account the defender's strike-back actions and the success rate of attacks. Under the premise that the attacker's actions are predicted from the game equilibrium, we use a risk calculation algorithm to compute the risk of the information system. Simulation results prove the effectiveness of the proposed model and algorithm.
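
      As a minimal Python illustration of the Bayesian ingredient, the sketch below computes a defender's best response by averaging payoffs over attacker types weighted by prior beliefs. The types, priors, payoffs, and per-type attack choices are all invented numbers, not the paper's model.

      priors = {"skilled": 0.3, "novice": 0.7}        # beliefs over attacker types

      # payoff[type][defense][attack] = defender's payoff (toy values)
      payoff = {
          "skilled": {"patch":  {"exploit": -2, "idle": 0},
                      "ignore": {"exploit": -9, "idle": 1}},
          "novice":  {"patch":  {"exploit": -1, "idle": 0},
                      "ignore": {"exploit": -4, "idle": 1}},
      }
      attack_of = {"skilled": "exploit", "novice": "idle"}  # assumed equilibrium play

      def expected(defense):
          return sum(priors[t] * payoff[t][defense][attack_of[t]] for t in priors)

      print(max(("patch", "ignore"), key=expected))    # defender's best response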

      A routing algorithm for information
      leakage reduction in Ad-hoc networks
      LIU Yujun,WANG Minghui,CAI Meng,CHEN Kun
      2015, 37(06): 1087-1092. doi:

      We analyze the approaches to and causes of information leakage during transmission in Ad-hoc networks, design an Ad-hoc network model that accounts for leaked information, and propose a routing algorithm for information leakage reduction. Building on a weighted graph model, we add node location information and authentication in order to reduce eavesdropping by external users and by un-trusted users within the group, and to give priority to forwarding through trusted users, thus reducing the probability of information leakage. After calculating the leakage probability of the un-trusted nodes, we select the node with the minimum leakage probability as the forwarding node and build a controllable forwarding node set that guarantees the minimum probability of information leakage. Finally, taking the main factors that affect performance as evaluation indicators, in accordance with the algorithm's design constraints, simulation results demonstrate the superiority of the proposed routing algorithm with respect to information leakage.
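
      A minimal Python sketch of the selection rule described above: among candidate neighbours, pick the forwarder with the smallest estimated leakage probability, and score a whole path as one minus the product of its per-node non-leakage probabilities. The probability estimates are placeholders, not the paper's estimator.

      def pick_forwarder(neighbours, leak_prob):
          return min(neighbours, key=lambda n: leak_prob[n])

      def path_leakage(path, leak_prob):
          # P(leak somewhere on the path) = 1 - prod(1 - p_i)
          q = 1.0
          for n in path:
              q *= 1.0 - leak_prob[n]
          return 1.0 - q

      leak_prob = {"A": 0.30, "B": 0.05, "C": 0.12}       # toy estimates
      print(pick_forwarder(["A", "B", "C"], leak_prob))   # -> B
      print(round(path_leakage(["B", "C"], leak_prob), 3))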

      A robust color image watermarking algorithm
      based on 3D-DCT and SVD 
      XIONG Xiangguang,WEI Li,XIE Gang
      2015, 37(06): 1093-1100. doi:

      In the field of digital image watermarking, most algorithms target gray images, and most color image watermarking algorithms embed the watermark only in the luminance component or in each channel of the color image separately. Such algorithms cannot make full use of the redundant space of color images, which limits the watermark's transparency and robustness. To address these problems, we propose a novel color image watermarking algorithm based on the three-dimensional discrete cosine transform (3D-DCT) and the singular value decomposition (SVD). Firstly, the watermark image is pre-processed and the color image is divided into non-overlapping blocks. Secondly, we perform the 3D-DCT on each block and apply the SVD to the first component of the 3D-DCT coefficients. At the embedding stage, we use a quantization method and a relationship method to embed the watermark into the largest singular value and into the second component of the 3D-DCT coefficients, respectively. At the extraction stage, we extract the watermark with the quantization method and the relationship method separately, compare the two results, and choose the one with the higher correlation value as the final extracted watermark. Experimental results show that the proposed algorithm has good transparency and resists common signal processing as well as blurring, distortion, and sharpening attacks.
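
      To make the quantization branch concrete, here is a minimal Python sketch of embedding one watermark bit into a block's largest singular value by quantization index modulation. The step size DELTA and the plain 2D block (standing in for the paper's 3D-DCT coefficient planes) are illustrative assumptions.

      import numpy as np

      DELTA = 24.0  # quantization step (assumed)

      def embed_bit(block, bit):
          u, s, vt = np.linalg.svd(block, full_matrices=False)
          q = int(np.floor(s[0] / DELTA))
          if q % 2 != bit:              # even multiples encode 0, odd encode 1
              q += 1
          s[0] = q * DELTA + DELTA / 2  # centre of the chosen quantization bin
          return u @ np.diag(s) @ vt

      def extract_bit(block):
          s = np.linalg.svd(block, compute_uv=False)
          return int(np.floor(s[0] / DELTA)) % 2

      block = np.random.default_rng(1).uniform(0, 255, (8, 8))
      assert extract_bit(embed_bit(block, 0)) == 0
      assert extract_bit(embed_bit(block, 1)) == 1
      print("round-trip OK")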

      Research on optimal actors’ relocation in wireless
      sensor and actor networks 
      ZHAO Xinyuan
      2015, 37(06): 1101-1108. doi:

      The connectivity of the actor network is vital in wireless sensor and actor networks (WSANs) due to the nature of WSAN operation, and relocating actors is an effective way to restore connectivity when actors fail. However, the relocation solutions proposed in recent studies do not optimize the relocation distance. In this paper we present a network-flow-based multi-objective actor relocation model that handles multiple actor failures, with two optimization objectives: minimizing the total overhead and minimizing the maximal individual overhead. To restore the connectivity of the network, the model imposes a flow-balancing condition on the actor network, which can be treated as a transportation network. Simulation results demonstrate the effectiveness of the proposed model and show that it outperforms existing approaches in terms of restoration overhead when handling multiple failures.
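
      A minimal Python sketch of the transportation-network view: spare actors supply one unit each, failed positions demand one unit each, edge costs are travel distances, and a min-cost flow decides who moves where. The networkx formulation and all coordinates are illustrative; the second objective (bounding the maximal individual move) is not modelled here.

      import networkx as nx

      spares = {"s1": (0, 0), "s2": (10, 0)}   # redundant actors (supply)
      holes = {"h1": (2, 1), "h2": (9, 3)}     # failed positions (demand)

      def dist(a, b):
          return round(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5)

      G = nx.DiGraph()
      for s in spares:
          G.add_node(s, demand=-1)             # each spare supplies one actor
      for h in holes:
          G.add_node(h, demand=1)              # each hole needs one actor
      for s, sp in spares.items():
          for h, hp in holes.items():
              G.add_edge(s, h, weight=dist(sp, hp), capacity=1)

      flow = nx.min_cost_flow(G)               # minimum total travel distance
      print([(s, h) for s in spares for h in holes if flow[s].get(h)])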

      A software credibility measurement model based
      on test data and D-S theory 
      SUN Jiaze1,2,WANG Shuyan2
      2015, 37(06): 1109-1113. doi:

      Considering that the objectivity of existing software credibility measurement models is weak, we present a testing-based software credibility measurement and assessment model that uses data from the software testing process as credible evidence. In the model, the key credibility attributes of the software testing process are selected and quantitatively measured according to the software test procedure and the Capability Maturity Model (CMM) for software. Finally, the dispersed confidence indexes are fused by Dempster-Shafer (D-S) evidence theory to yield the software credibility. Classical examples show that the model is effective for software credibility measurement.
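
      The fusion step uses the standard Dempster rule of combination; here is a minimal Python sketch of that rule over the two-hypothesis frame {credible, not_credible}. The mass values are invented, standing in for the model's per-attribute confidence indexes.

      from itertools import product

      def combine(m1, m2):
          # m1, m2: dicts mapping frozenset hypotheses -> mass
          fused, conflict = {}, 0.0
          for (a, x), (b, y) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  fused[inter] = fused.get(inter, 0.0) + x * y
              else:
                  conflict += x * y                 # mass on empty intersections
          return {h: v / (1.0 - conflict) for h, v in fused.items()}

      C, N = frozenset(["credible"]), frozenset(["not_credible"])
      theta = C | N                                 # total ignorance
      m1 = {C: 0.6, N: 0.1, theta: 0.3}
      m2 = {C: 0.7, N: 0.2, theta: 0.1}
      print(combine(m1, m2))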

      A UML model-based analysis approach for 
      provenance-aware access control policies  
      SUN Lianshan1,QI Zhibin2,HOU Tao1
      2015, 37(06): 1114-1126. doi:

      Provenance is the historical metadata of data objects. It has recently been used to enable provenance-based access control (PBAC), which grants or denies an access request according to the provenance of either the subject or the object. However, provenance can only be collected at runtime in the form of complex directed acyclic graphs, so it is difficult for security architects to specify PBAC policies efficiently: the provenance graphs are complex and unavailable at design time. We explore a UML model-based approach to analyzing PBAC policies. Specifically, we first introduce a conceptual provenance model that shields the complexity of the provenance graphs and enables policy analysis at design time. We then introduce a UML model-based process that guides the analysis of the conceptual provenance model and the PBAC policies along with object-oriented development. We validate the proposed approach on an enterprise online training system.

      An ontology matching algorithm based on artificial immunity 
      DONG Jide,XIE Qiang,DING Qiulin
      2015, 37(06): 1127-1134. doi:

      Because of the low efficiency and low accuracy of traditional ontology matching algorithms, we introduce an automatic ontology matching algorithm based on artificial immunity to rapidly retrieve the required sub-ontology from an existing ontology pool. The algorithm constructs an antigen ontology model from information about users’ behaviors, determines its domain context by context matching, obtains the ontology with the highest matching degree via structure matching, and finally selects the right ontology through semantic matching. Experimental results show that the algorithm improves both the precision and the efficiency of ontology matching.

      Research and simulation of kernel function
      selection for support vector machine  
      LIANG Liming,ZHONG Zhen,CHEN Zhaoyang
      2015, 37(06): 1135-1141. doi:

      The support vector machine (SVM) is a kernel-based learning method. Kernel function selection has a significant influence on SVM performance, so how to select the kernel function effectively is an important problem in SVM research. Most current kernel selection methods neither consider the characteristics of the data distribution nor make full use of the prior information implicit in the data. We introduce the concept of energy entropy and, combining it with the hypersphere description and measurement properties of kernel functions, put forward a kernel selection method based on the energy entropy of the sample distribution, so as to improve the learning and generalization ability of SVM. Simulations on numerical examples show that the method is feasible and effective.

      A non-monotonic trust region algorithm for solving
      complementary support vector machine  
      GAO Leifu,YU Dongmei,ZHAO Shijie
      2015, 37(06): 1142-1147. doi:

      Solving a large-scale convex quadratic programming problem lies at the core of the support vector machine (SVM). An equivalent complementarity problem can be derived from an amended SVM model, so we propose a non-monotonic trust region algorithm, built on the Fischer-Burmeister complementarity function, for solving that complementarity problem. The new algorithm computes neither the Hessian matrix nor its inverse, which reduces the computational cost. Global convergence of the algorithm is proved without additional assumptions. Numerical experiments show that the algorithm runs faster than the LSVM algorithm and the descent algorithm on large-scale nonlinear classification problems, and thus provides a feasible method for solving SVM.
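
      The Fischer-Burmeister function itself is standard: phi(a, b) = sqrt(a^2 + b^2) - a - b, which vanishes exactly when a >= 0, b >= 0 and ab = 0. The Python sketch below shows it and the merit function it induces; the trust-region machinery is not reproduced, and the toy mapping F is an illustration.

      import numpy as np

      def fb(a, b):
          # phi(a, b) = 0  <=>  a >= 0, b >= 0, a * b = 0
          return np.sqrt(a ** 2 + b ** 2) - a - b

      def merit(x, F):
          # 0.5 * ||phi(x, F(x))||^2, driven to zero at an NCP solution
          r = fb(x, F(x))
          return 0.5 * float(r @ r)

      F = lambda x: np.array([x[0] - 1.0, x[1] + 2.0])
      print(merit(np.array([1.0, 0.0]), F))   # a solution: merit = 0
      print(merit(np.array([2.0, 1.0]), F))   # not a solution: merit > 0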

      Image preprocessing for RMB serial number recognition  
      FENG Boyuan1,REN Mingwu1,ZHANG Xuyao2,YANG Jingyu1
      2015, 37(06): 1148-1153. doi:

      In recent years, research on RMB (renminbi bank note, the paper currency used in China) serial number recognition has drawn more and more attention, with promising applications in fighting financial crime and maintaining financial and social stability. The accuracy of RMB recognition relies heavily on the image preprocessing results. In this paper, we propose a complete preprocessing system covering RMB image sampling, skew correction, identification of the note's orientation, serial number region detection and binarization, and character extraction. Experimental results demonstrate that the proposed method achieves high precision and facilitates the subsequent character recognition task.
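
      As an example of one stage, here is a minimal OpenCV (Python) sketch of skew correction: estimate the note's rotation from the minimum-area rectangle around the foreground pixels and rotate the image back. The file name, the Otsu thresholding, and the angle handling are illustrative assumptions rather than the paper's exact procedure.

      import cv2
      import numpy as np

      img = cv2.imread("rmb_note.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
      _, bw = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
      coords = np.column_stack(np.where(bw > 0)).astype(np.float32)
      angle = cv2.minAreaRect(coords)[-1]      # skew of the foreground rectangle
      if angle < -45:                          # normalize minAreaRect's range
          angle += 90
      h, w = img.shape
      M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
      deskewed = cv2.warpAffine(img, M, (w, h), flags=cv2.INTER_LINEAR)
      cv2.imwrite("rmb_deskewed.png", deskewed)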

      Automatic detection of cell nucleus in pathological images
      based on multi-curvature contour features  
      ZHANG Yi1,PANG Baochuan2
      2015, 37(06): 1154-1160. doi:

      Automatic nucleus detection is not only an important part of pathological image analysis but also one of the main bottlenecks limiting the accuracy of the technology, because pathological slices exhibit uneven layering and staining and the nuclei crowd or overlap. To improve the accuracy of nucleus detection, we design a multi-curvature contour model of the nucleus. The nucleus contour features are extracted by a multi-curvature orientation energy filter. The contour features, together with the ground-truth nucleus annotations, are then fed into a boosting algorithm, which produces a pixel classifier separating foreground from background. Finally, the mean-shift algorithm yields a nucleus confidence coefficient for each location. Experimental results show that, compared with other state-of-the-art nucleus detection methods, our method is more robust to biological variation, different staining and illumination conditions, and touching or partially occluded nuclei.

      A robust tracking algorithm based on improved Mean Shift   
      XU Haiming1,HUANG Shan1,2,LI Yuntong1
      2015, 37(06): 1161-1167. doi:

      The Mean Shift algorithm has difficulty handling moving targets that change scale greatly or become occluded. To solve this problem, we propose a bandwidth-adaptive, anti-occlusion tracking algorithm based on multi-level square matching and fragments. The algorithm uses the centroid deviation of the target model and a bandwidth-trial method to generate candidate scales. The motion trend of the target is predicted by the multi-level square matching method, and the scale whose candidate target gives the largest Bhattacharyya coefficient is selected as the new bandwidth of the Mean Shift kernel function. At the same time, we divide the target into several fragments, adaptively change their weights according to the degree of occlusion, and fuse the results of the effective fragments under certain rules. Experimental results show that the algorithm tracks targets robustly.
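
      For reference, the scale-selection criterion named above is a standard similarity measure: the Bhattacharyya coefficient rho(p, q) = sum_i sqrt(p_i * q_i) between the target histogram q and a candidate histogram p. A minimal Python sketch, with toy histograms standing in for kernel-weighted color histograms:

      import numpy as np

      def bhattacharyya(p, q):
          p = p / p.sum()
          q = q / q.sum()
          return float(np.sum(np.sqrt(p * q)))

      q = np.array([4.0, 3.0, 2.0, 1.0])                  # target model
      candidates = {0.9: np.array([4.0, 2.0, 2.0, 2.0]),  # per-scale candidates
                    1.0: np.array([4.0, 3.0, 2.0, 1.0]),
                    1.1: np.array([1.0, 2.0, 3.0, 4.0])}
      best = max(candidates, key=lambda s: bhattacharyya(candidates[s], q))
      print(best)                                         # -> 1.0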

      Precise positioning of laser tags in large-scale
      steel plate measurement  
      QIAN Qiang,PANG Linbin,WANG Zhi,BAI Suqin
      2015, 37(06): 1168-1174. doi:

      To achieve better accuracy and speed in vision-based non-contact measurement, camera measurement technologies and laser measurement technologies can be integrated. When registering the two measurement systems, precise positioning of a particular marker (such as a laser tag) is the key to improving measurement accuracy. We propose a precise positioning method for laser tags. First, a GPU-based Histogram of Oriented Gradients (HOG) algorithm detects the laser tags on the steel plate; thanks to the parallel computing advantage of the GPU, very large images can be processed in real time, meeting the requirements of industrial measurement. Then, image binarization supports further precise positioning with the centroid method; here we propose a novel binarization method based on the Difference of Gaussians (DoG), which obtains better results than traditional methods such as Otsu and Bernsen. Experimental results show that the accuracy of the proposed method reaches 1 mm or better.
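
      A minimal Python sketch of the last two stages, DoG binarization followed by centroid localization; the Gaussian sigmas and the threshold are illustrative assumptions, and a synthetic blob stands in for a laser tag.

      import numpy as np
      from scipy.ndimage import center_of_mass, gaussian_filter

      def locate_tag(img, s1=1.0, s2=3.0):
          dog = gaussian_filter(img, s1) - gaussian_filter(img, s2)
          mask = dog > 0.5 * dog.max()      # binarize the band-pass response
          return center_of_mass(mask)       # sub-pixel (row, col) centroid

      y, x = np.mgrid[0:64, 0:64]
      img = np.exp(-((x - 40.3) ** 2 + (y - 21.7) ** 2) / 8.0)
      print(locate_tag(img))                # close to (21.7, 40.3)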