Volume 2 Issue 5- May 2012


S.No Title
1. Optimization of the Edges Detected by Canny Operator through Segmentation
Megha Seth, Dr. Sipi Dubey

A problem with applying the Canny operator directly for edge detection on an image that contains a large number of boundaries is that the output becomes cluttered with edges; because of this complexity, finding a particular region of the image becomes very difficult. In this paper we use the K-means clustering algorithm to optimize the output of the Canny operator: the image is first segmented into a number of segments specified by the user as required by the application, and the Canny operator is then applied to the segmented image.
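As a rough illustration of the pipeline the abstract describes (segment first, then detect edges), here is a minimal Python sketch; the data is synthetic and a plain gradient threshold stands in for the full Canny operator:

```python
import numpy as np

def kmeans_segment(img, k, iters=10):
    # Cluster pixel intensities into k groups and replace each pixel by
    # its cluster centroid, giving a piecewise-constant image.
    pixels = img.reshape(-1, 1).astype(float)
    centroids = np.linspace(pixels.min(), pixels.max(), k).reshape(k, 1)
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels - centroids.T), axis=1)
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centroids[j] = members.mean()
    labels = np.argmin(np.abs(pixels - centroids.T), axis=1)
    return centroids[labels].reshape(img.shape)

def gradient_edges(img, thresh=10.0):
    # Gradient-magnitude edge map: a crude stand-in for the Canny operator.
    gy, gx = np.gradient(img.astype(float))
    return (np.hypot(gx, gy) > thresh).astype(np.uint8)

# Synthetic image: two flat regions plus mild noise.
rng = np.random.default_rng(1)
img = np.zeros((32, 32))
img[:, 16:] = 100
img += rng.normal(0, 2, img.shape)

seg = kmeans_segment(img, k=2)    # segmentation first ...
edges = gradient_edges(seg)       # ... then edge detection on the segments
```

Because the segmented image is piecewise constant, only the true region boundary survives as an edge.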

Keywords— Image segmentation, Global segmentation, Local segmentation, Clustering, K-Means, Canny Operator, Centroid, RMSE.

Full Text PDF
2. Wireless Sensor Networks Application and Control in Industrial System: A Study Report
M.Chithik Raja

This research article provides a study of implementing wireless sensor network (WSN) technology for industrial system monitoring and control. First, existing industrial applications are explored, followed by a review of the advantages of adopting WSN technology for industrial control. Then, the challenging factors influencing the design and acceptance of WSNs in the process-control world are outlined, and state-of-the-art research efforts and industrial solutions are presented for each factor. Further research issues for the realization and improvement of wireless sensor network technology in industrial systems are also discussed.

Keywords— Wireless Sensor Networks, Remote Transducer, Wi-Fi, Topology, Control System

Full Text PDF
3. Burst Noise Detection and Cancellation in DS-CDMA
Chaitra S and Bhaskara Rao N

This paper presents a new algorithm for the detection and correction of a certain type of burst noise spike in DS-CDMA receivers.

Keywords— DS-CDMA, burst noise cancellation, Walsh–Hadamard orthogonal codes.

Full Text PDF
4. Evaluating the Performance of the Students in the Technical Education through Data Mining Technique
Shiv Kumar Gupta, Dr. Ritu Vijay

In order to understand how and why data mining works, it is important to understand a few fundamental concepts. First, data mining relies on four essential methods: classification, categorization, estimation, and visualization. Classification identifies associations and clusters, and separates the subjects under study. Categorization uses rule-induction algorithms to handle categorical outcomes, such as “persist” or “dropout,” and “transfer” or “stay.” Visualization uses interactive graphs to demonstrate mathematically induced rules and scores, and is far more sophisticated than pie or bar charts; it is used primarily to depict three-dimensional geographic locations of mathematical coordinates. Higher-education institutions can use classification for a comprehensive analysis of student characteristics, or use estimation to predict the likelihood of a variety of outcomes, such as transferability, persistence, retention, and course success. In this paper we evaluate the performance of students through the scores they obtained in different years, as well as their dropout percentage for each year.

Keywords: Data Mining, Technical Education, Student Retention, Classification, Visualization.

Full Text PDF
5. Modified Sift Algorithm for Appearance Based Recognition of American Sign Language
Jaspreet Kaur, Navjot Kaur

The concern of this paper is to investigate the application of the Scale-Invariant Feature Transform (SIFT) to the problem of hand-gesture recognition using MATLAB. The algorithm uses a modified SIFT approach to match keypoints between the query image and the original database of bare-hand images. The extracted features are highly distinctive, as they are invariant to shift, scale, and rotation. They are also partially invariant to illumination and affine transformations. All these properties make them very suitable to the problem at hand. A performance improvement for SIFT has also been proposed. Experimental results show the efficient performance of the developed algorithm in recognizing all the images provided in the training set.
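The keypoint-matching step that SIFT-based recognition relies on can be sketched as a Lowe-style ratio test; the descriptors below are toy vectors, not real SIFT output, and the paper's modified matching may differ:

```python
import numpy as np

def match_descriptors(query, db, ratio=0.8):
    # Lowe-style ratio test: accept a match only when the nearest database
    # descriptor is clearly closer than the second nearest.
    matches = []
    for i, d in enumerate(query):
        dists = np.linalg.norm(db - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy descriptors: a database of 4 orthogonal vectors and one noisy query.
db = np.eye(4) * 10.0
query = np.array([[10.0, 0.1, 0.0, 0.0]])
matches = match_descriptors(query, db)
```

The ratio test discards ambiguous matches, which is what makes the matched keypoints reliable for recognition.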

Keywords: SIFT, ASL Recognition, ASL using MATLAB, Image Processing
Full Text PDF
6. SignPro-An Application Suite for Deaf and Dumb
Ashish Sethi, Hemanth S, Kuldeep Kumar, Bhaskara Rao N, Krishnan R

This application helps deaf and dumb persons communicate with the rest of the world using sign language. Suitable existing methods are integrated into this application. The key feature of the system is real-time gesture-to-text conversion. The processing steps include gesture extraction, gesture matching, and conversion to speech. Gesture extraction involves the use of various image-processing techniques such as histogram matching, bounding-box computation, skin-colour segmentation, and region growing. Techniques applicable for gesture matching include feature-point matching and correlation-based matching. We have come up with four different approaches based on the methods used for gesture extraction and matching. A comparative study of these approaches is also carried out to rank them based on time efficiency and accuracy. The other features of the application include voicing out of text and text-to-gesture conversion.
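One of the listed extraction steps, bounding-box computation, is simple enough to sketch; the binary mask here is synthetic and would in practice come from a step such as skin-colour segmentation:

```python
import numpy as np

def bounding_box(mask):
    # Bounding box (row and column extents) of a binary hand mask.
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return int(r0), int(r1), int(c0), int(c1)

# Synthetic mask standing in for a segmented hand region.
mask = np.zeros((10, 10), dtype=bool)
mask[2:5, 3:8] = True
box = bounding_box(mask)
```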

Keywords—Bounding box approach, correlation approach, histogram matching, point matching algorithm, region growing approach, skin color segmentation and text to speech conversion application

Full Text PDF
7. Visualizing Survivability Aspects in WDM Networks
Arvind Kumar Singh, Pankaj Singh, Ajaz Ahmad, Akhilesh Kumar Singh, Omjeet Singh, Rajan Mani Tripathi

The focus is on providing high data-transfer rates and utilization of high bandwidth, so computer networks have completed a journey from co-axial and twisted-pair cables to optical cables, migrating from SONET/SDH to WDM-technology-based networks. This paper deals with various aspects of WDM-based networks in the context of fiber-link failures in WDM optical networks, and with the survivability aspects of WDM-based networks.

Keywords: Survivability, lightpath routing, max-flow min-cut theorem, maximum survivable path set, computational complexity.

Full Text PDF
8. Performance Evaluation of 2D WxT Codes for OCDMA Systems in Noisy Environment
Preeti Mockoul, Shalini Dhingra

OCDMA is a multiple-access technology for next-generation communication networks. The main aim of this paper is to devise a code set with good system performance. WxT 2-D matrix codes for optical code-division multiple access (OCDMA) are of increasing interest because of their inherently high cardinality (code-set size), high information spectral density (ISD), and the ease of adapting WDM-like components for their implementation.
The basic idea of constructing a 2-D code is to assign temporal locations (time chips) of optical pulses to each spatial channel (or wavelength) in such a way that, for any two distinct code words, there is a coincidence of optical pulses at only one spatial channel (or wavelength). A performance analysis of 2-D WxT optical orthogonal codes for an incoherent optical CDMA system is carried out for various system and code parameters under a noisy environment.

Keywords-Optical Code Division Multiple Access, Optical Orthogonal Codes, Wavelength x Time two-dimensional optical orthogonal code (WxT 2-D OOC).

Full Text PDF
9. Diffie–Hellman Type 2D Key Exchange Scheme Using 2D-Convolution
Priya Nandihal and Bhaskara Rao N

A 2D Diffie–Hellman type key exchange method is presented. This new method uses a 2D discrete convolution process to provide a secure shared 2D key between two users, and it can be extended to more than two users.
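A minimal sketch of why a convolution-based exchange yields a common key, assuming circular 2D convolution (the paper's exact variant, and any hardening against deconvolution attacks, are not shown here):

```python
import numpy as np

def conv2d_circular(a, b):
    # 2-D circular convolution via the FFT; convolution is commutative and
    # associative, which is what makes the exchange below consistent.
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

rng = np.random.default_rng(42)
G = rng.integers(0, 10, (8, 8)).astype(float)   # public 2D "generator" array
A = rng.integers(0, 10, (8, 8)).astype(float)   # Alice's private 2D array
B = rng.integers(0, 10, (8, 8)).astype(float)   # Bob's private 2D array

pub_A = conv2d_circular(G, A)       # Alice sends this to Bob
pub_B = conv2d_circular(G, B)       # Bob sends this to Alice

key_A = conv2d_circular(pub_B, A)   # Alice's copy of the shared 2D key
key_B = conv2d_circular(pub_A, B)   # Bob's copy: G*A*B = G*B*A
```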

Keywords— 2D key, key exchange, 2D discrete convolution, image as a key.

Full Text PDF
10. Optimum Placement of Additional Sink Nodes in a Wireless Sensor Network
Mamatha G and Dr B G Premasudha

A new heuristic algorithm for solving the conditional p-center problem is described. In the new algorithm, the initial locations of the p additional sink nodes are selected using a greedy algorithm. Clusters are then grown iteratively around the sink nodes, which act as the seeds. In successive iterations, the locations of the p additional sink nodes are updated to minimize the maximum weighted distance from a sensor node to its nearest sink node. The iteration process terminates when all the sensor nodes have been included in their respective clusters and the p sink-node locations have received their final update.
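The heuristic as summarized above (greedy seeding, then iterative relocation to shrink the maximum distance) might look like the following sketch; the node coordinates and the 1-center update rule are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def place_sinks(nodes, p, iters=10):
    # Greedy seeding: start from the first node, then repeatedly add the
    # node farthest from the sinks chosen so far.
    sinks = [nodes[0]]
    for _ in range(p - 1):
        d = np.linalg.norm(nodes[:, None] - np.array(sinks)[None], axis=2).min(axis=1)
        sinks.append(nodes[np.argmax(d)])
    sinks = np.array(sinks, dtype=float)
    # Iterative refinement: assign nodes to their nearest sink, then move
    # each sink to the cluster member minimizing the max member distance.
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(nodes[:, None] - sinks[None], axis=2), axis=1)
        for j in range(p):
            cluster = nodes[labels == j]
            if len(cluster) == 0:
                continue
            dd = np.linalg.norm(cluster[:, None] - cluster[None], axis=2)
            sinks[j] = cluster[np.argmin(dd.max(axis=1))]
    return sinks, labels

# Two well-separated groups of sensor nodes (illustrative coordinates).
nodes = np.array([[0, 0], [1, 0], [0, 1], [1, 1],
                  [100, 100], [101, 100], [100, 101], [101, 101]], float)
sinks, labels = place_sinks(nodes, p=2)
coverage = np.linalg.norm(nodes[:, None] - sinks[None], axis=2).min(axis=1).max()
```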

Keywords—Wireless Sensor Network, conditional p-center problem, p-medoid clustering.

Full Text PDF
11. Flexible Provisioning and Integrated Load Balancing of Resources in the Cloud
Pranav Kurbet, Poornima A.B

Computing services that are provided by datacenters over the internet are now commonly referred to as cloud computing. Cloud computing promises virtually unlimited computational resources to its users, while letting them pay only for the resources they actually use at any given time.
Our goal is to build the next generation of resource management in cloud computing. We propose “Flexible Provisioning and Integrated Load Balancing of Resources in Cloud”, in which the cloud (provider) and the users build a symbiotic relationship. Instead of renting a set of specific resources, the user simply presents the job to be executed to the cloud, which uses an associated pricing model to quote a price for executing it.
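The keywords name a greedy scheduler; one plausible minimal reading, assigning each submitted job to the least-loaded machine, is sketched below (the authors' actual scheduler may differ):

```python
import heapq

def greedy_schedule(jobs, m):
    # Assign each job (given as a runtime) to the currently least-loaded
    # of m machines; the final makespan is the largest machine load.
    heap = [(0, i) for i in range(m)]            # (load, machine id)
    assignment = {i: [] for i in range(m)}
    for j, runtime in enumerate(jobs):
        load, i = heapq.heappop(heap)
        assignment[i].append(j)
        heapq.heappush(heap, (load + runtime, i))
    return assignment, max(load for load, _ in heap)

# Five illustrative jobs with runtimes, scheduled onto two machines.
assignment, makespan = greedy_schedule([3, 3, 2, 2, 2], m=2)
```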

Keywords— Greedy Scheduler, Deadline Division Scheduler, Central Monitor, Resource Allocator, Cloud Infrastructure.

Full Text PDF
12. A Comparative Study of Electrocardiographic Changes between Non smokers and Smokers
Amit Srivastava, Anuj Poonia, Suman Shekhar, R.P.Tewari

Cardiovascular diseases are on a constant rise in developing countries, and cigarette smoking is a major risk factor for cardiovascular disease. In this study, changes in the ECG were evaluated in healthy adult male smokers and non-smokers to identify possible high-risk factors for cardiovascular disease. ECGs were recorded in smokers and non-smokers; subjects aged 22 to 30 years were selected. After taking consent and a detailed history from the subjects, the electrocardiogram was recorded in the resting supine position. The ECG results were evaluated for different parameters such as heart rate, P-wave, P-R interval, QRS complex, QT interval, and T-wave. A probability (p) value of less than 0.05 (p < 0.05) was considered statistically significant. There was a statistically significant increase in heart rate and a decrease in QRS complex and T-P interval in smokers compared with non-smokers. The P-wave, P-R interval, QT interval, ST segment, and T-wave did not show statistically significant differences. The probability (p) value was calculated with the help of the mean, standard deviation (SD), and number of observations (N). The results showed that smoking potentially increases the risk of cardiovascular disease.
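The statistical comparison described (p-values computed from mean, SD, and N) corresponds to a two-sample t test; here is a sketch with made-up heart-rate values, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    # Two-sample (Welch) t statistic computed from the means, standard
    # deviations and sample sizes, as the abstract describes; the p-value
    # would then be read from a t-distribution table.
    return (mean(a) - mean(b)) / sqrt(stdev(a) ** 2 / len(a) +
                                      stdev(b) ** 2 / len(b))

smokers_hr = [80, 82, 84]        # illustrative heart rates, not study data
non_smokers_hr = [70, 72, 74]
t = welch_t(smokers_hr, non_smokers_hr)
```

For these toy samples |t| is well above typical critical values, so the difference would be judged significant.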

Keywords— Electrocardiogram, Heart rate, QRS complex, T-P interval, Probability (p)

Full Text PDF
13. Enhancing Security in Cloud Access Mechanism by Establishing Covert Storage Channel
Amanjot Kaur, Amarpreet Singh

Moving computing into the “Cloud” makes computer processing much more convenient for users, but it also presents them with new security problems concerning safety and reliability. While accessing the data, we follow a data policy known to a third party, and that third party can use our data. To overcome this problem, we propose a new scheme that uses the covert-channel technique to secure the data communicated between cloud server and client from the third party (any process/subject).

Keywords: Cloud Computing, Covert Channel, CSP, Virtualization.

Full Text PDF
14. A Sequence Analysis Approach to perform Resource Allocation in Mobile Networks
Geeta Rani, Rita Chhikara

A mobile network is the interconnection of a vast number of users over a network, each of whom is assigned the same number of resources. Such a network suffers from both congestion and starvation. The proposed work aims to obtain the maximum outcome from the network by finding the most frequent access pattern among different kinds of users. We present an improved inverted-table approach to find the most frequent pattern among mobile users: given input describing mobile usage in terms of user access, we find the user or user group utilizing the network with maximum usage, and the maximum resources are then distributed among such users. The work is divided into two main modules: one maintains the database as an inverted table, and the second resolves user queries on this inverted table to obtain the most appropriate results from the database.
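The inverted-table idea can be sketched as follows; the access log and user names are invented for illustration, and the paper's improved table structure is not reproduced:

```python
from collections import defaultdict

# Toy access log of (user, resource) events; data is illustrative only.
log = [("u1", "video"), ("u2", "video"), ("u1", "mail"),
       ("u1", "video"), ("u3", "mail"), ("u1", "mail")]

# Inverted table: resource -> users that accessed it.
inverted = defaultdict(list)
for user, res in log:
    inverted[res].append(user)

# Total usage per user, derived by walking the inverted table.
usage = defaultdict(int)
for users in inverted.values():
    for u in users:
        usage[u] += 1

heaviest = max(usage, key=usage.get)   # candidate for maximum resources
```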

Keywords: Mobile Network, Resource Allocation, Inverted Table, Bandwidth, Time Sharing

Full Text PDF
15. To Compare Encryption Algorithms on Application and Database Layer on the Basis of Computation Time
Jaskaran Kaur, Richa Sharma

In data communication, information security plays an important role, and encryption algorithms are vital to any information-security system. These algorithms consume a significant amount of computing resources such as CPU time, memory, battery power, and computation time. This paper performs a comparative analysis of two algorithms, RSA and the transposition cipher, at two levels, the application level and the database level, with computation time as the parameter. Encryption at the database level and at the application level has proved to be an effective method of protecting sensitive data.
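A minimal harness for the comparison described, using textbook RSA with tiny primes and a simple columnar transposition (illustration only; neither construction is production-grade, and the paper's exact setup is assumed, not known):

```python
import time

# Textbook RSA with tiny primes: illustration only, nowhere near secure.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)                      # modular inverse (Python 3.8+)

def rsa_encrypt(msg):
    return [pow(ord(c), e, n) for c in msg]

def rsa_decrypt(ct):
    return "".join(chr(pow(x, d, n)) for x in ct)

# Simple columnar transposition (write row-wise, read column-wise).
def transpose_encrypt(msg, cols=4):
    pad = msg + " " * (-len(msg) % cols)
    return "".join(pad[r] for c in range(cols) for r in range(c, len(pad), cols))

def transpose_decrypt(ct, cols=4):
    rows = len(ct) // cols
    return "".join(ct[r + c * rows] for r in range(rows) for c in range(cols)).rstrip()

msg = "sensitive record field"
t0 = time.perf_counter(); rsa_ct = rsa_encrypt(msg); t_rsa = time.perf_counter() - t0
t0 = time.perf_counter(); tr_ct = transpose_encrypt(msg); t_tr = time.perf_counter() - t0
```

Timing both operations over many repetitions would give the computation-time comparison the paper reports; the modular exponentiation in RSA is expected to dominate.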

Keywords— Transposition cipher, cryptography, database-level encryption, application-level encryption.

Full Text PDF
16. Compressing Video and Hiding Data in the same Compressed Video using the Error Predictions
AleeMahamed SK, A. Surendra Reddy

This paper deals with compressing video and hiding messages in it; the message may be a picture, an audio clip, or a video. Unlike data hiding in images and raw video, which operates in the time or transform domain, we target the candidate motion vectors used to encode and reconstruct frames in both the forward and reverse directions of the compressed video, namely the P and B frames. These motion vectors are selected based on their prediction errors, which differentiates this approach from previous ones based on attributes such as magnitude and phase. A threshold is searched for every frame to achieve robustness while keeping the error level low. The data bit stream is hidden in the least significant bit of both components of the selected motion vectors in the compressed video. The evaluation is based first on low distortion to the video and second on minimum overhead in the compressed video size. On these measures, the described method is found to perform well when compared with a motion-vector attribute-based method from the literature.
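The LSB-embedding step, taken in isolation, can be sketched as below; the candidate selection by prediction error and the integration with the video codec are assumed to happen elsewhere:

```python
def embed_bits(motion_vectors, bits):
    # Overwrite the least significant bit of each (x, y) motion-vector
    # component with one payload bit; which vectors qualify (by prediction
    # error) is assumed to have been decided already.
    out, i = [], 0
    for x, y in motion_vectors:
        if i < len(bits):
            x = (x & ~1) | bits[i]; i += 1
        if i < len(bits):
            y = (y & ~1) | bits[i]; i += 1
        out.append((x, y))
    return out

def extract_bits(motion_vectors, n):
    # Read the payload back from the LSBs, in the same order.
    bits = []
    for x, y in motion_vectors:
        bits += [x & 1, y & 1]
    return bits[:n]

mvs = [(4, 7), (2, 3)]          # illustrative motion vectors
payload = [1, 0, 1]
stego = embed_bits(mvs, payload)
recovered = extract_bits(stego, len(payload))
```

Each embedded bit changes a vector component by at most one, which is why the distortion to the reconstructed video stays low.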

Full Text PDF