
INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST)-VOL III ISSUE IV, VER. 4, OCT. TO DEC., 2012


S.No. Research Topic Paper ID
   130 Digital Filtering of Electric Motors Infrared Thermographic Images

Anna V. Andonova, Nadezhda M. Kafadarova

Abstract

Image filtering plays an important role in the post-processing and analysis of thermographic images. The place of digital filtering in the overall imaging chain of quantitative thermography is analyzed in the paper. A technological scheme is proposed for the thermographic study of an asynchronous electric motor intended for use in electromobile engineering. The results from the implementation of digital filtering in the MATLAB environment are presented. Four types of digital filters are thoroughly studied. The data is analyzed and recommendations are made as to when each filter should be preferred, depending on the chosen goal and desired result.
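The abstract does not name the four filters; as an illustration only, the sketch below applies four common smoothing filters (median, Gaussian, averaging, Wiener — an assumption, not necessarily the four the authors studied) to a thermographic frame in Python, whereas the study itself used MATLAB.

```python
# Minimal sketch: four common digital filters applied to a 2-D array
# standing in for an infrared frame. The filter choice is illustrative.
import numpy as np
from scipy import ndimage, signal

thermal = np.random.rand(240, 320)            # stand-in for an IR frame

filtered = {
    "median":   ndimage.median_filter(thermal, size=3),
    "gaussian": ndimage.gaussian_filter(thermal, sigma=1.0),
    "average":  ndimage.uniform_filter(thermal, size=3),
    "wiener":   signal.wiener(thermal, mysize=3),
}
for name, img in filtered.items():
    print(name, float(np.abs(img - thermal).mean()))  # crude smoothing measure
```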
Full Paper

IJCST/34/4/A-1225
   131 An Evaluative Study to Measure Teacher’s Competency in using Word Processing Applications in Teaching Learning Process with Special Reference to Selected English Medium Schools of Karad City in Satara District

S. P. Shinde, Dr. B. S. Sawant

Abstract

Today's world is the world of Information Technology. IT is becoming a part of our day-to-day life, so it has to be incorporated into the field of education as well. Many schools, colleges and universities all over the world have already adopted this technology in the teaching and learning process. But mere adoption of such technology does not make it effective; its actual implementation does. A teacher is a building block of any education system and cannot be replaced by any other means; in India, teachers are worshipped next to God. So, in order to implement computers in the teaching-learning process, a teacher should be competent enough to use computer applications effectively and efficiently. This paper therefore evaluates the competency level of teachers in using word processing applications in their teaching and learning.
Full Paper

IJCST/34/4/A-1226
   132 Performance Evaluation and Optimization of ODMRP Routing Protocols over 802.11 Based Wireless Mobile Ad-Hoc Network

Nagendra Sah, Neelam Rup Prakash, Deepak Bagai

Abstract

The ODMRP protocol offers a very effective routing mechanism in wireless mobile ad-hoc networks, but its performance deteriorates when the network is subjected to a large amount of traffic. We have proposed a modification to the ODMRP protocol: congestion in the network is controlled by reducing RREQ packet retransmissions by the source node and by managing buffer space. The simulation has been done on the Qualnet 5.0 simulator. The results show a significant improvement in throughput as well as a reduction in the number of dropped packets and in jitter.
Full Paper

IJCST/34/4/A-1227
   133 Large Scale Image Search using Visual Reranking by Soft Computing Approach

Pushpanjali. M. Chouragade, Dr. P. N. Chatur

Abstract

Image search has become a popular feature in many search engines, including Yahoo!, MSN and Google, the majority of which use very little, if any, image information. The explosive growth and widespread accessibility of community-contributed media content on the Internet has led to a surge of research activity in visual search. However, it remains uncertain whether such techniques will generalize to a large number of popular Web queries and whether the potential improvement to search quality justifies the additional computational cost. Owing to the success of text-based search of Web pages and, in part, to the difficulty and expense of using image-based signals, most search engines return images solely based on the text of the pages from which the images are linked; no image analysis takes place to determine relevance or quality. This can yield results of inconsistent quality, and such a search approach has proven unsatisfying as it often entirely ignores the visual content itself as a ranking signal. To address this issue, visual re-ranking, defined as the reordering of images based on their visual appearance, can be used. The major advantages of this approach are that it requires little human intervention and improves search performance.
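As a hedged illustration of visual re-ranking, the sketch below reorders a text-ranked result list by similarity to the centroid of the top-ranked images' visual features, a simple pseudo-relevance-feedback scheme; the feature source and the blending weight are assumptions, not the paper's exact method.

```python
# Minimal sketch: blend the original text-rank score with visual similarity
# to the centroid of the top-k results (assumed pseudo-relevant).
import numpy as np

def visual_rerank(features, text_scores, top_k=5, alpha=0.5):
    """features: (n, d) visual descriptors in initial text-rank order."""
    centroid = features[:top_k].mean(axis=0)
    # cosine similarity of every image to the pseudo-relevant centroid
    sims = features @ centroid / (
        np.linalg.norm(features, axis=1) * np.linalg.norm(centroid) + 1e-9)
    combined = alpha * np.asarray(text_scores) + (1 - alpha) * sims
    return np.argsort(-combined)              # new ranking, best first

feats = np.random.rand(20, 128)
order = visual_rerank(feats, text_scores=np.linspace(1, 0, 20))
print(order[:5])
```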
Full Paper

IJCST/34/4/A-1228
   134 Predicting the Mental Status of a Patient and Suggesting the Care to the Patient by Designing a Tool through Soft-Computing Techniques

Anjali-Mathur, P. C. Gupta

Abstract

To diagnose the mental status of a patient, a doctor makes use of EEG reports, and the analysis of these EEG reports is a technical task. In this work, a tool is designed to predict the mental status and suggest the possible care to the patient.
Full Paper

IJCST/34/4/A-1229
   135 Simulation and Performance Analysis of Unicasting Routing Protocol in Wireless Manet

Kavita Nain, Dinesh Arora, Nagendra Sah

Abstract

MANET is an emerging technology in the field of wireless networks. These technologies can be used when, for example, a natural calamity destroys the established network. The routing protocol plays an important role in improving the performance of a MANET. There are three different types of routing protocols: proactive, reactive and hybrid. These protocols have various characteristics depending on the ways in which the routes between different nodes in the network are established. In this work, we discuss the performance of different routing protocols in MANET. The simulation is carried out in Qualnet 5.0 and compared with the performance achieved with the NS-2 software; the results obtained with Qualnet are expected to be better.
Full Paper

IJCST/34/4/A-1230
   136 A New Square Shaped Fractal Dipole Antenna for Multiband Application

Rajni Bala, Dr. Jaswinder Singh

Abstract

In this paper, the design of a square-shaped multiband dipole antenna using fractal geometry is described. The fractal antenna has been designed on an FR-4 substrate of thickness h = 1.4 mm and dielectric constant εr = 4.4. Ansoft HFSS 13 software has been used to design and simulate the antenna over different iterations. The antenna exhibits multiband resonances due to the self-similarity in its structure. It is observed that with an increase in iterations there is an increase in the number of resonance frequencies along with an improvement in return loss and VSWR. The experimental results indicate that in the final iteration the resonance frequencies obtained are 2.75 GHz, 3.75 GHz, 4.75 GHz, 5.15 GHz and 6.15 GHz with VSWR 1.37, 1.25, 1.25, 1.13 and 1.14 respectively. The designed square fractal antenna is also compared with a Sierpinski dipole antenna, and it is observed that more resonance frequencies are obtained in the case of the square fractal dipole antenna, with improved values of return loss and VSWR.
Full Paper

IJCST/34/4/A-1231
   137 Progressive Supervising of Distance Based Range Queries With Minimal I/O Cost

A. V. Seetha Lakshmi, Dr. S. P. Victor

Abstract

We focus on distance-based range queries that continuously change their locations in a Euclidean space. We present an efficient and effective monitoring technique based on the concept of a safe zone. Given a positive value r, a distance-based range query returns the objects that lie within distance r of the query location. The safe zone of a query is the area with the property that, while the query remains inside it, the results of the query remain unchanged; hence, the query does not need to be re-evaluated unless it leaves the safe zone. We propose a technique based on powerful pruning rules and a unique access order which efficiently computes the safe zone and minimizes the I/O cost.
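A minimal sketch of the safe-zone idea follows, using a conservative circular margin; the paper's pruning rules and access order for computing the exact safe zone are not reproduced.

```python
# While the query's displacement stays below the safe distance, no object
# can cross the circle of radius r, so the result set provably cannot change.
import math

def range_query(objects, q, r):
    return {o for o in objects if math.dist(o, q) <= r}

def safe_distance(objects, q, r):
    # smallest movement of the query after which some object may cross
    # the radius-r circle -> a conservative (circular) safe zone radius
    return min(abs(math.dist(o, q) - r) for o in objects)

objs = [(1, 1), (4, 5), (9, 2), (6, 6)]
q, r = (5, 5), 3.0
print(range_query(objs, q, r),
      "safe while displacement <", safe_distance(objs, q, r))
```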
Full Paper

IJCST/34/4/A-1232
   138 Wireless Sensor Network and Its Security Concerns

Rohit Tiwari, Dr. Ashok Jetawat

Abstract

Wireless Sensor Networks (WSNs) are one of the most vital means of collecting information. WSNs, which are fast and simple to install and maintain, are going to survive in real-world scenarios. Although security concerns were overlooked in earlier WSNs, sensor networks frequently deal with extremely sensitive content, and such networks have little interaction with the external environment while operating. In this paper we first discuss the WSN architecture and its core element, the wireless sensor. We then emphasize the need for wireless sensors and the factors to consider while selecting one. We also focus on identifying some major security concerns for WSNs and propose a few solutions for the same.
Full Paper

IJCST/34/4/A-1233
   139 Improving Web Search Result using Semantic Similarity with Re-ranking

Rutuja Ajmire, A. V. Deorankar, Dr. P. N. Chatur

Abstract

The aim of this paper is to re-rank web search results using semantic similarity in order to improve the quality of search engines. First, the top N results returned by a search engine such as Google are obtained; then semantic similarities between the content obtained from the web search results and the user's query are computed. A semantic similarity algorithm based on the WordNet ontology is used to calculate the similarity of each snippet to the query, and re-ranking is performed based on this similarity. A balanced similarity ranking method is combined with Google's own rank: the ranking position is first converted to an importance score, so that semantics, rather than plain keyword matching, is used to rank the web pages and better adapt to their timeliness.
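A minimal sketch of WordNet-based re-ranking along these lines is shown below; it assumes NLTK with the WordNet corpus installed (nltk.download('wordnet')), and the blending weight and snippet tokenization are illustrative choices, not the paper's exact algorithm.

```python
# Score each snippet by its best WordNet path similarity to the query terms,
# then blend that score with the original rank position.
from nltk.corpus import wordnet as wn

def wn_similarity(word_a, word_b):
    best = 0.0
    for s1 in wn.synsets(word_a):
        for s2 in wn.synsets(word_b):
            sim = s1.path_similarity(s2)
            if sim is not None and sim > best:
                best = sim
    return best

def rerank(query_terms, snippets, alpha=0.5):
    scored = []
    for pos, snippet in enumerate(snippets):
        words = snippet.lower().split()
        sem = max((wn_similarity(q, w) for q in query_terms for w in words),
                  default=0.0)
        rank_score = 1.0 - pos / max(len(snippets) - 1, 1)  # original order
        scored.append((alpha * sem + (1 - alpha) * rank_score, snippet))
    return [s for _, s in sorted(scored, reverse=True)]

print(rerank(["car"], ["used automobile dealers", "python tutorial"]))
```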
Full Paper

IJCST/34/4/A-1234
   140 Dynamic Data Storage and Public Auditability for Cloud Storage Security

B Raj Kumar Rathod, Sravan Kumar Mateti, Dr. G. Narsimha

Abstract

IT has moved into the next generation with cloud computing being realized. The way application software and databases are stored has changed: they are now stored in cloud data centers, in which security is a concern from the client's point of view. This new phenomenon, used to store and manage data without capital investment, has brought many security challenges which are not yet thoroughly understood. This paper focuses on the security and integrity of data stored in cloud data servers. Data integrity verification is done by a third party auditor (TPA) who is authorized to check the integrity of the data periodically on behalf of the client, and the client gets notifications from the third party auditor when data integrity is lost. Besides verification of data integrity, the proposed system also supports data dynamics; the work that has been done in this line lacks data dynamics and true public auditability. The auditing task monitors data modifications, insertions and deletions, and the proposed system is capable of supporting both public auditability and data dynamics. The review of literature has revealed the problems with existing systems, and that is the motivation behind taking up this work. A Merkle Hash Tree is used to improve block-level authentication. In order to handle auditing tasks simultaneously, a bilinear aggregate signature is used; this enables the TPA to perform auditing concurrently for multiple clients. The experiments reveal that the proposed system is very efficient and also secure.
Index Terms: Cloud computing, public auditability, cloud storage, cloud service provider
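A minimal sketch of the Merkle Hash Tree used for block-level verification is given below; the bilinear aggregate signatures and the TPA protocol itself are not reproduced.

```python
# The Merkle root commits to all data blocks; recomputing it after any
# change exposes tampering (any single block can also be checked against
# the root with a logarithmic-size sibling path).
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"block-%d" % i for i in range(8)]
root = merkle_root(blocks)
blocks[3] = b"tampered"
print("integrity intact:", merkle_root(blocks) == root)  # False after change
```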
Full Paper

IJCST/34/4/A-1235
   141 Monitoring APIs for Live Audio-Video Conferencing

Shruti S. Kulkarni, Dr. Prashant N. Chatur

Abstract

Video conferencing and telepresence technology is an inexpensive enabler for companies to improve collaboration and reduce costs. As more and more companies adopt video conferencing as an integrated component of their daily business, they need a way to ensure that not only the technology but also the video conferencing service itself is running optimally. The Traverse comprehensive and unified monitoring platform for video and call quality, in addition to network performance, allows enterprises to get faster ROI and drives lower TCO for their integrated video conferencing services. Monitoring audio-video conferences helps an enterprise to manage the A-V conferences, and the data collected with the monitoring software guides business processes in various decision-making aspects.
Full Paper

IJCST/34/4/A-1236
   142 Augmented Reality on Cloud Computing Through Concept Similarity

J. Srinivasa Rao, G. Sudhir

Abstract

This paper contains a comparative study of the disclosure limitation guarantees and the theoretical and practical utility of various approaches to publishing search logs. Our comparison includes earlier work on anonymity and indistinguishability and our proposed solution to achieve probabilistic differential privacy in search logs. In our comparison, we revealed interesting relationships between indistinguishability and probabilistic differential privacy which might be of independent interest. Our results (positive as well as negative) can be applied more generally to the problem of publishing frequent items or itemsets. Our paper concludes with a large experimental study using real applications, in which we compare ZEALOUS with previous work that achieves k-anonymity in search log publishing. Our results show that ZEALOUS yields comparable utility to k-anonymity while at the same time achieving much stronger privacy guarantees. Two agents previously unknown to each other cannot communicate by exchanging concepts (nodes of their own ontology): they need to use a common communication language. If they do not use a standard protocol, most likely they use a natural language; its ambiguities, and the different concepts the agents possess, give rise to imperfect understanding among them. The use of data stored in transaction logs of Web search engines, Intranets, and Web sites can provide valuable insight.
Full Paper

IJCST/34/4/A-1237
   143 Comparison of Various Feature Detectors and Outliers Removal

Ann Therese Francy, D. Raveena Judie Dolly, Renu Maria Mathews

Abstract

Image registration is the process in which two or more images of the same scene taken at different times are overlaid. For registering two images, the preliminary steps are feature detection, matching and control point detection. For feature detection and matching there are many point detectors, the most common being the Scale Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF). The control points are detected using SIFT, SURF and correlation-based matching; the results are compared using the number of keypoints and matches, and the best results are taken forward for further processing. The outliers are then removed using the RANSAC algorithm.
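A minimal sketch of this pipeline, assuming OpenCV and placeholder image paths, is shown below: SIFT keypoints, ratio-test matching, then RANSAC-based outlier removal via homography estimation.

```python
# SIFT keypoints + descriptor matching with Lowe's ratio test, then RANSAC
# (inside homography estimation) to discard outlier correspondences.
import cv2
import numpy as np

img1 = cv2.imread("scene_t1.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
img2 = cv2.imread("scene_t2.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]                # ratio test

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(len(good), "matches,", int(mask.sum()), "inliers after RANSAC")
```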
Full Paper

IJCST/34/4/A-1238
   144 Load Pattern Analysis of Electricity Customers based on Clustering Algorithm

Rupali Meshram, A. V. Deorankar, Dr. P. N. Chatur

Abstract

Electricity load forecasting has been an important risk management and planning tool for electric utilities, and is necessary for the economic generation of power. Classification of load patterns is an important task for load forecasting: customers are grouped into classes according to their load characteristics. Different unsupervised clustering algorithms (modified follow-the-leader, k-means, fuzzy k-means and two types of hierarchical clustering) and the Self-Organising Map are used to group together customers having similar electrical behaviour. In this approach, all customer load curves are first clustered with the clustering algorithms under a given number of clusters. This paper presents supervised and unsupervised algorithms for the classification of electricity customers.
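As a hedged illustration of the clustering step, the sketch below groups synthetic normalized daily load curves with k-means; the number of clusters and the data are assumptions.

```python
# Cluster daily load curves (one 24-point row per customer) with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hours = np.arange(24)
residential = 0.5 + 0.5 * np.sin((hours - 18) / 24 * 2 * np.pi)   # evening peak
industrial = np.where((hours >= 8) & (hours < 17), 1.0, 0.2)      # workday plateau
curves = np.vstack(
    [residential + 0.05 * rng.standard_normal(24) for _ in range(50)]
    + [industrial + 0.05 * rng.standard_normal(24) for _ in range(50)])
curves /= curves.max(axis=1, keepdims=True)       # normalize each load curve

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(curves)
print(np.bincount(km.labels_))                    # customers per load class
```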
Full Paper

IJCST/34/4/A-1239
   145 Performance Analysis of Topological Variation in Personal Area Network using ZigBee Wireless Sensors

Lovish Jaiswal, Jaswinder Kaur, Gurjeevan Singh

IJCST/34/4/A-1240
   146 Imaged Based Morphological Study of Bryophytes and Green Algae for Identifying Evolutionary Relationship between Them

P. Sanyal, S. K. Bandyopadhyay

Abstract

It is well known that all land plants evolved from a green algal ancestor. Bryophytes represent the first step in the evolution of plants from algae, and also the first step in the transition of multicellular autotrophic eukaryotes from water onto land. The latest phylogenetic studies show that bryophytes as a group share a common ancestor with green algae, and DNA studies support this close evolutionary relationship. A few other studies, such as those on cell wall elongation and the origin of the cuticle, also establish that bryophytes and algae share a common ancestor. This paper presents the findings of our latest image-based morphological studies of the evolutionary relationship between bryophytes and green algae. Features used in this study include the compactness, area, perimeter and number of serrations of sample thallus images of these two species.
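For the shape features named above, a minimal sketch using scikit-image is given below; it assumes the common compactness definition perimeter² / (4π · area), which may differ from the authors' exact formulation, and omits thresholding and the serration count.

```python
# Area, perimeter and compactness of a segmented region in a binary mask.
import numpy as np
from skimage import measure

mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True                      # stand-in for a segmented thallus

props = measure.regionprops(measure.label(mask))[0]
compactness = props.perimeter ** 2 / (4 * np.pi * props.area)
print("area:", props.area, "perimeter:", round(props.perimeter, 1),
      "compactness:", round(compactness, 2))   # 1.0 ~ a perfect circle
```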
Full Paper

IJCST/34/4/A-1241
   147 Analysis and Prevention of Black Hole Attack in Ad-Hoc Networks

Ranjeet Suryawanshi, Veeresh P M

Abstract

Mobile ad-hoc networks are prone to a number of security threats. The fact that mobile ad-hoc networks lack fixed infrastructure and use wireless links for communication makes them very susceptible to an adversary's malicious attacks. The black hole attack is one of the severe security threats in ad-hoc networks, and can be easily mounted by exploiting vulnerabilities of on-demand routing protocols such as AODV. In this paper, we have proposed a solution based on an Intrusion Detection System (IDS), evaluated in NS2, to prevent black hole attacks imposed by both single and multiple black hole nodes. Results of a simulation study show that the proposed solution maximizes network performance while effectively preventing black hole attacks against mobile ad-hoc networks.
Full Paper

IJCST/34/4/A-1242
   148 Object Based Geometrical and Texture Feature Extraction of Face from Front View

Arun Kumar Nagdeve, Somesh Kumar Dewangan

Abstract

Face recognition has been a fast-growing, challenging and interesting area in real-time applications, and a large number of face recognition algorithms have been developed over the decades. For face recognition, feature extraction plays a crucial role. This paper presents a method for automatically extracting the geometric and texture features of a face from the front view: a cumulative histogram approach is used for extracting geometric features, while co-occurrence matrices are used for extracting texture features. From the input image, the face location is detected using the Viola-Jones algorithm, from which different objects such as the left eye, right eye, nose and mouth area are cropped and processed. For geometric feature extraction, the histogram of each object is computed and its cumulative histogram values are employed, varying the threshold, to create a binary image of each object; a simple linear search technique is then applied to detect the corner end points of each object. For texture feature extraction, the co-occurrence matrix of each object is determined, and from it the angular second moment, entropy, maximum probability of occurrence of pixels, inverse difference, inverse difference moment, mean and contrast of each object are computed.
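A minimal sketch of the texture step, assuming scikit-image, follows: a gray-level co-occurrence matrix is built for a cropped region and Haralick-style statistics are read off; entropy is computed manually, and the crop is a random stand-in.

```python
# GLCM texture features for one cropped face object (e.g., an eye region).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

region = (np.random.rand(32, 32) * 8).astype(np.uint8)   # stand-in crop, 8 gray levels
glcm = graycomatrix(region, distances=[1], angles=[0], levels=8,
                    symmetric=True, normed=True)

features = {
    "ASM":         graycoprops(glcm, "ASM")[0, 0],         # angular second moment
    "contrast":    graycoprops(glcm, "contrast")[0, 0],
    "homogeneity": graycoprops(glcm, "homogeneity")[0, 0], # inverse difference moment
    "entropy":     -np.sum(glcm * np.log2(glcm + 1e-12)),
}
print(features)
```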
Full Paper

IJCST/34/4/A-1243
   149 Efficiently Identifying the Data Leakages

K. Sudhakara Babu, P. Sudhakar Babu

Abstract

Data leakage is defined as the accidental or unintentional distribution of private or sensitive data to an unauthorized entity. Sensitive data in companies and organizations includes intellectual property (IP), financial information, patient information, personal credit-card data, and other information depending on the business and the industry. Data leakage poses a serious issue for companies, as the number of incidents and the cost to those experiencing them continue to increase. Data leakage is enhanced by the fact that transmitted data (both inbound and outbound), including emails, instant messaging, website forms, and file transfers among others, are largely unregulated and unmonitored on their way to their destinations. Furthermore, in many cases, sensitive data are shared among various stakeholders such as employees working from outside the organization's premises (e.g., on laptops), business partners, and customers, which increases the risk that confidential information will fall into unauthorized hands. Whether caused by malicious intent or an inadvertent mistake by an insider or outsider, exposure of sensitive information can seriously hurt an organization. The data leakage problem can be defined as any unauthorized access to data due to an improper implementation or inadequacy of a technology, process or policy. The "unauthorized access" described above can be the result of malicious, intentional or inadvertent data leakage, or of a bad business/technology process, by an internal or external user. Traditionally, this leakage of data is handled by watermarking, a technique which requires modification of the data: if the watermarked copy is found at some unauthorized site, the distributor can claim ownership. To overcome the disadvantages of using watermarks [2], data allocation strategies are used to improve the probability of identifying guilty third parties. In this project, we implement and analyze a guilt model that detects the agents using allocation strategies without modifying the original data. The guilty agent is one who leaks a portion of the distributed data. The idea is to distribute the data intelligently to agents, based on sample data requests and explicit data requests, in order to improve the chance of detecting the guilty agents. The algorithms implemented using fake objects improve the distributor's chance of detecting guilty agents. It is observed that by minimizing the sum objective, the chance of detecting guilty agents increases. We also developed a framework for generating fake objects.
Full Paper

IJCST/34/4/A-1244
   150 Intelligent Text Data Compression and Encryption Technique for Secured Data Transfer

Ravindra Kale, V. A. Gulhane

Abstract

The amount of data exchanged due to the widespread use of the Internet has increased by leaps and bounds in the last few years. This large amount of data not only requires more storage but also consumes large bandwidth. Data compression techniques are thus particularly useful in communications because they enable devices to transmit or store the same amount of data in fewer bits. Further, the rise of modern-day services such as on-line shopping, stock trading, e-ticketing, e-banking and electronic bill payment often requires data confidentiality, and the use of an efficient encryption algorithm can cater to the security aspect. A large number of compression and encryption algorithms have been proposed in the last few decades. This paper presents an efficient compression and encryption model to resolve these issues of large data size and data security.
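As a hedged illustration of the compress-then-encrypt pipeline, the sketch below uses zlib and Fernet (an AES-based scheme from the `cryptography` package) as stand-ins for the paper's unspecified algorithms; compressing first matters, since ciphertext compresses poorly.

```python
# Compress the text, then encrypt the compressed bytes.
import zlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"some repetitive text " * 100
compressed = zlib.compress(plaintext, level=9)
token = cipher.encrypt(compressed)                     # transmit/store this

recovered = zlib.decompress(cipher.decrypt(token))
assert recovered == plaintext
print(len(plaintext), "->", len(compressed), "compressed,",
      len(token), "encrypted")
```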
Full Paper

IJCST/34/4/A-1245
   151 Horizontal Aggregations using Generalized Projections

J. Sarvani, D. Sri Lakshmi

Abstract

Existing SQL aggregations have limitations when preparing data sets because they return one column per aggregated group using group functions. A new class of extended aggregate functions, called horizontal aggregations, was introduced to help prepare data sets for data mining and OLAP cube exploration. Earlier, simple yet powerful methods (CASE, PIVOT and SPJ) to generate aggregated columns in a horizontal tabular layout were developed. The first (SPJ) relies on standard relational operators. The second (CASE) relies on the SQL CASE construct. The third (PIVOT) uses a built-in operator of a commercial DBMS that is not widely available. The SPJ method is important from a theoretical point of view because it is based on select, project and join (SPJ) queries; both the CASE and PIVOT evaluation methods are significantly faster than the SPJ method. We propose to use a technique called Generalized Projections (GPs) to improve the performance of the SPJ method. The proposed technique captures aggregations, group-bys, conventional projection with duplicate elimination (distinct), and duplicate preservation. It improves SPJ performance significantly, since applying aggregations early in query processing can provide significant performance improvements.
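A minimal sketch of the CASE method follows, with SQLite standing in for the DBMS and illustrative table and column names: one output column per group value is produced with SUM(CASE ...).

```python
# Horizontal aggregation via the SQL CASE construct: one row per store,
# one aggregated column per quarter (instead of one row per (store, quarter)).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (store TEXT, quarter TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("A", "Q1", 10), ("A", "Q2", 20), ("B", "Q1", 5), ("B", "Q2", 7)])

rows = con.execute("""
    SELECT store,
           SUM(CASE WHEN quarter = 'Q1' THEN amount ELSE 0 END) AS q1,
           SUM(CASE WHEN quarter = 'Q2' THEN amount ELSE 0 END) AS q2
    FROM sales GROUP BY store
""").fetchall()
print(rows)   # [('A', 10.0, 20.0), ('B', 5.0, 7.0)]
```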
Full Paper

IJCST/34/4/A-1246
   152 Spatial Mining Using Auto Regression Model

P. Ramesh Babu, K. Srinivas

Abstract

Spatial data mining fulfils the real needs of many geomatic applications. Many organizations have collected large amounts of spatially referenced data in various application areas such as traffic, banking and marketing. Mining spatial data is very valuable for vital strategic decision making. Geographical databases are useful for studying road accidents, vehicle flow and sometimes the mobility of inhabitants; these data contain useful information for traffic control in the busiest areas on the roads, such as administrative areas, schools and market areas. In this paper, a study is made of identifying and predicting the accident risk of a road. Previous work based on decision tree techniques mined accident data and the details of the corresponding road sections, combining the accident data with trend data relating to the road network, traffic flow, population, buildings, etc. The existing work used a multilayer spatial data mining approach, i.e. each layer is combined with another layer's dataset using spatial criteria, and a standard method is then applied to build a decision tree. We propose a new method based on the spatial auto-regression model, a popular spatial data mining technique which has been used in many applications with geo-spatial datasets.
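For reference, a first-order spatial auto-regressive model takes the standard textbook form below (general background, not an equation reproduced from this paper):

```latex
y = \rho W y + X\beta + \varepsilon,
\qquad \varepsilon \sim \mathcal{N}(0, \sigma^{2} I)
```

where y is the vector of observations (e.g., accident risk per road section), W the spatial neighbourhood matrix, ρ the spatial autocorrelation parameter, X the explanatory attributes and β their coefficients.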
Full Paper

IJCST/34/4/A-1247
   153 A Combinatorial Morphological Algorithm to Smooth Images

Dr. G. Samuel Vara Prasad Raju, K. Yogeswara Rao, D. Siva Phanindra

Abstract

This paper provides an image smoothing algorithm that uses a combination of two strategies: a genetic algorithm and an ant-colony algorithm. The two can be combined to provide a more effective approach to image smoothing. However, this paper does not use the two algorithms exactly as they stand, but uses the idea behind their problem-solving processes to arrive at a better solution.
Full Paper

IJCST/34/4/A-1248
   154 Data Obfuscation for Critical Solutions

D. V. Harish, K. R. Pradeep, Sridhar. K. M, Gangadhar. M. L

Abstract

Data masking is the process of obscuring (masking) specific data elements within data stores. It ensures that sensitive data is replaced with realistic but not real data; the goal is that sensitive customer information is not available outside of the authorized environment. Data masking is typically done while provisioning non-production environments, so that copies created to support test and development processes do not expose sensitive information, thus avoiding the risk of leaks. Masking algorithms are designed to be repeatable so that referential integrity is maintained. While organizations typically have strict controls on production systems, data security in non-production instances is often left to trusting the employee, with potentially disastrous results. Creating test and development copies in an automated process reduces the exposure of sensitive data, and since database layouts often change, it is useful to maintain a list of sensitive columns without rewriting application code. Data masking is an effective strategy for reducing the risk of data exposure from inside and outside of an organization, and should be considered a best practice for securing non-production databases. No literature was found on the application of data masking techniques to business-critical data warehouse testing applications. Hence, a model is proposed which can be used uniformly across the industry for testing business-critical data.
Full Paper

IJCST/34/4/A-1249
   155 Comparison of Parametric and Non-Parametric Models in Human Pose Assessment

Dammalapati Neelima, Gangina Sridevi

Abstract

Human pose estimation models the functional mapping, or conditional distribution, between image features and 3D pose; learning such multi-modal models in high-dimensional spaces is difficult. To address these issues, latent variable models (LVMs) have been proposed. Shared LVMs attempt to learn a coherent, typically non-linear, latent space shared by image features and 3D poses, the distribution of the data in that latent space, and conditional distributions to and from this latent space in order to carry out inference, thereby discovering the shared manifold structure. We present a framework that addresses this shortcoming: latent spaces and distributions for image features and 3D poses are learned separately first, and a multi-modal conditional density between these two low-dimensional spaces is then learned in the form of Gaussian Mixture Regression. Using our model, we can address the issues of overfitting and generalization.
Full Paper

IJCST/34/4/A-1250
   156 Perturbation Approach for Protecting Data Server used for Decision Tree Mining

T. Nirosh Kumar, G. Jaya Raju

Abstract

Data mining is the step-by-step process of extracting interesting rules from large amounts of data. The data can be stored in a database server, a file or a data warehouse, and the data servers must be protected from unauthorized access. Decision tree mining is one of the classification algorithms that construct rules from a centralized data set. This paper describes how to protect data in a server used for decision tree mining, and describes a perturbation-based privacy preserving algorithm. It constructs unrealized data sets equivalent to the original data set; the main idea is that the decision tree derived from the original data set is the same as the decision tree derived from the unrealized data set. The experimental results show that the approach outperforms the other techniques.
Full Paper

IJCST/34/4/A-1251
   157 Mutual Authentication-robust user Authentication for Online Services

V. Uma Maheswari, S. Anitha Reddy

Abstract

The purpose of this paper is to present an improvement of the Needham-Schroeder public key protocol. This new protocol uses partial quotients obtained from the continued fraction expansion of certain irrational numbers to secure the authentication between two principals. We introduce a new approach to the use of pseudorandom numbers: besides using these numbers to provide uniqueness and timeliness guarantees, we use them to ensure that nobody can guess the identity of the sender. We also keep this new protocol secure against the Lowe attack, without adopting the solution suggested by Lowe. The protocol remains fast, although some partial quotients are computed during the authentication process.
Full Paper

IJCST/34/4/A-1252
   158 Mining of Sequential Patterns with Constraint in Large Databases

Viswanadhuni Kishore, V. Sambasiva Rao, Rachapalli Rajaiah

Abstract

Constraint-based mining of sequential patterns is an active research area motivated by many application domains. In practice, real sequence datasets can present consecutive repetitions of symbols (e.g., DNA sequences, discretized stock market data) that can lead to a very significant consumption of resources during pattern extraction, which can make even efficient algorithms unusable. In this paper, we investigate this issue and point out that the framework developed for constrained frequent-pattern mining does not fit our mission well. An extended framework is developed based on a sequential pattern growth methodology. Our study shows that constraints can be effectively and efficiently pushed deep into sequential pattern mining under this new framework.
Full Paper

IJCST/34/4/A-1253
   159 Towards Creating a RFID Authentication Enabled Secure Group Communication Plane

K S Jagadeesh, Dr. Somashekhar C Desai, Chandramouli. H, Kashyap D Dhruve

Abstract

Securing data transactions in distributed networks to support reliable group communications is always desired. The realization of an RFID-authentication-enabled Secure Group Communication Plane (RASCP) is discussed in this paper. RFID tags are commonly used for the purpose of identification, and the proposed RASCP uses RFID tags for group member identification and protocol initialization. The communications amongst the group members are secured using the commutative RSA cryptographic algorithm. Researchers have, more often than not, concentrated predominantly on securing the prevalent RFID technology itself; the proposed RASCP adopts the positives of existing RFID technology and incorporates these features to construct secure group communication planes. The proposal is compared with an existing secure group communication protocol, and overcomes the drawbacks of key exchange, key storage, key distribution and the need for external servers to facilitate key management functions that commonly exist in current group communication protocols.
Full Paper

IJCST/34/4/A-1254
   160 Optimal Routing for Delay Tolerant Network

Mahaboob Subhani Shaik CH, M.V. Rama Krishna

Abstract

Routing in Delay Tolerant Networks (DTNs) is a challenging problem because, at any given time instance, the probability that there is an end-to-end path from a source to a destination is low. Ensuring stable end-to-end transmissions is therefore very difficult in sporadic network environments such as DTNs, where future node connections are uncertain. Since future node connections are mostly unknown in these networks, opportunistic forwarding is used to deliver messages. Inspired by human mobility traces, an earlier opportunistic forwarding technique that uses conventional intermeeting time to deliver messages was replaced by the Conditional Shortest Path Routing (CSPR) protocol, which uses conditional intermeeting time as the link metric and achieves a higher delivery rate and lower end-to-end delay than shortest-path based routing protocols. Approaches that require a priori connectivity knowledge, however, appear infeasible for a self-configuring network. In this paper, we present a routing protocol that only uses observed information about the network. We design a metric that estimates how long a message will have to wait before it can be transferred to the next hop. When connections are established, routing is recomputed, and messages are exchanged if the topology suggests that a connected node is "closer" than the current node.
Full Paper

IJCST/34/4/A-1255
   161 Understanding Shared Technology in Cloud Computing and Data Recovery Vulnerability

Tribikram Pradhan, Mukesh Patidar, Keerthi Reddy, Asha A

Abstract

Cloud computing is a virtualization technology which shares a pool of computing resources. The cloud is an emerging technology which combines the features of traditional computing with networking technologies such as parallel, distributed and grid computing. Cloud computing is a new stage of Internet evolution which can handle a large number of customers at a time by sharing resources over the Internet. Data owners can remotely store their data in the cloud and enjoy cloud characteristics such as on-demand self-service, resource pooling, rapid elasticity, ubiquitous network access and measured service. Vulnerability is an important factor of risk in cloud computing, exploited by threats to cause harm to the system. Cloud vulnerabilities include unauthorized access to the management interface, Internet protocol vulnerabilities, data recovery vulnerability, and metering and billing evasion. A management interface is required for the on-demand characteristics of cloud computing, and unauthorized access to it is an issue. Cloud services are accessed over the Internet using standard protocols that are untrusted, so network vulnerabilities are relevant to cloud computing; and in the cloud we enjoy pay-per-use, i.e. the services we use are metered. This paper focuses on the data recovery vulnerability, which arises from the cloud characteristics of resource pooling and elasticity: a resource allocated to one user is reallocated to a different user at a later time. The solution we present involves strong access control, authentication of administrative access and operations, vulnerability scanning, complete deletion of a user's data after usage, evaluation of unauthorized environments, strong service level agreements for vulnerability remediation, and strong encryption of the data.
Full Paper

IJCST/34/4/A-1256
   162 A New Efficient Approach for Removal of High Density Salt and Pepper Noise in Videos Using Modified Decision Based Unsymmetric Trimmed Median Filter

M. Jyothi

Abstract

A new and efficient algorithm for high-density salt and pepper noise removal in videos is proposed. The proposed algorithm uses a modified decision-based unsymmetric trimmed median filter to remove high-density salt and pepper noise. This filter restores gray scale and color images that are highly corrupted by salt and pepper noise. Existing methods include the standard median filter (MF), the decision-based algorithm (DBA), the modified decision-based algorithm (MDBA), and the progressive switched median filter (PSMF). The proposed method can remove high-density noise, its computational speed is higher than that of existing filters, and it takes feedback from noisy or corrupted pixels. The proposed algorithm is tested against different gray scale and color images and gives a better Peak Signal-to-Noise Ratio (PSNR) and Image Enhancement Factor (IEF). The results of the algorithm are compared with various existing algorithms, and it is shown that the new method has better visual appearance and quantitative measures at noise densities as high as 90%.
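A minimal sketch of the trimmed-median rule described above follows: only extreme-valued (0 or 255) pixels are processed, the 3×3 window is trimmed of all 0/255 values before taking the median, and the window mean is used when every neighbour is also noisy.

```python
# Modified decision-based unsymmetric trimmed median filtering of one frame.
import numpy as np

def mdbutmf(img):
    out = img.astype(np.float64).copy()
    padded = np.pad(img, 1, mode="edge").astype(np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            if img[i, j] != 0 and img[i, j] != 255:
                continue                       # noise-free pixel: keep as is
            win = padded[i:i + 3, j:j + 3].ravel()
            trimmed = win[(win != 0) & (win != 255)]
            out[i, j] = np.median(trimmed) if trimmed.size else win.mean()
    return out.astype(img.dtype)

img = np.full((5, 5), 120, dtype=np.uint8)
img[2, 2], img[1, 1] = 255, 0                 # inject salt and pepper
print(mdbutmf(img))
```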
Full Paper

IJCST/34/4/A-1257
   163 Survey on Challenges and Threats in Cloud Computing

M. A. Lakshmi, B. Sumalatha

Abstract

Cloud computing model provides ubiquitous, on-demand access to computing resources like networks, servers, applications and storage. As cloud computing becomes prevalent, more sensitive information is being centralized into the cloud such as personal health records, corporate intellectual property, government documents. Along with many benefits, Cloud computing brings critical challenges and security threats. This paper discusses various challenges such as security threats, scalability, availability, authentication and compliance and legal issues.
Full Paper

IJCST/34/4/A-1258
   164 Position Based Routing in Mobile Ad-Hoc Networks: An Overview

Simardeep Kaur, Anuj K. Gupta

Abstract

Mobile ad hoc networks are widely used nowadays due to their several advantages over other networks. Several protocols are used in wireless ad hoc networks for finding routes, such as reactive and proactive protocols. Hybrid protocols combine the advantages of both reactive and proactive protocols, and position-based routing protocols can be considered part of the hybrid family. Position-based routing protocols use GPS to determine the availability of routes. In this paper, various protocols for position-based routing in mobile ad hoc networks are discussed.
Full Paper

IJCST/34/4/A-1259
   165 Associating Anti-Spam Method in a Secure Overlay Cloud Network

K. Phalguna Rao, Dr. Vinay Chavan

Abstract

Cloud computing is the newest term for the long-dreamed vision of computing as a utility. The cloud provides convenient, on-demand network access to a centralized pool of configurable computing resources that can be rapidly deployed with great efficiency and minimal management overhead. Industry leaders and customers have wide-ranging expectations for cloud computing, in which security concerns remain a major aspect. Security services such as Intrusion Detection Systems (IDS), anti-virus software, anti-spam software and Distributed Denial of Service (DDoS) protection are used in a general cloud-based security overlay network that acts as a transparent overlay network. We focus on and analyze various anti-spam methods, and carry out a comparative study among them to show their efficiency in providing security in the cloud overlay network.
Full Paper

IJCST/34/4/A-1260
   166 Study on Efficiency of Predictive Prefetching and Caching Algorithms

Dr. G. Arumugam, Dr. S. Suguna

Abstract

The World Wide Web is an important area for data mining research due to its huge amount of information, and the success of the WWW depends on response time. Predictive prefetching is an important technique for reducing latency. To predict user requests, millions of web logs from the server side need to be analyzed, and identification of user session boundaries is one of the most important processes for the predictive prefetching of a user's next request based on their navigation behavior. In this paper, user session boundaries are identified using IP address and browsing agent, then by considering inter-session and intra-session timeouts, and finally by immediate link analysis. A complete set of user session sequences, and a learning graph based on these sequences, is also generated. We note that earlier works ignored some of the following important issues for prediction: analysis of non-prefetchable items; prefetching of objects that are newly created or never visited before; analysis of the aging factor; document size to be cached and cache utilization factors; and analysis of the document duplication process. This paper proposes an algorithm that prefetches objects based on all of the above factors except the document duplication problem. The survey indicates that GDSF-based predictive web caching (NGRAM) and keyword-based semantic prefetching with LRU (KBSP) outperform the existing methods, so in this study the performance of the NGRAM and KBSP methods is compared against the proposed algorithms. The performance metrics in our experimental study are the prefetching hit ratio, byte hit ratio and waste ratio for different cache sizes.
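A minimal sketch of the session-boundary step is given below: records are grouped by (IP address, user agent) and split whenever the gap between consecutive requests exceeds an intra-session timeout; the 30-minute value is a conventional choice, and the link-analysis refinement is omitted.

```python
# Split a timestamp-sorted log into per-user sessions by inactivity gaps.
from collections import defaultdict

TIMEOUT = 30 * 60   # intra-session timeout, seconds

def sessionize(records):
    """records: iterable of (ip, agent, timestamp, url), timestamp-sorted."""
    sessions = defaultdict(list)    # (ip, agent) -> list of sessions
    for ip, agent, ts, url in records:
        user = (ip, agent)
        if sessions[user] and ts - sessions[user][-1][-1][0] <= TIMEOUT:
            sessions[user][-1].append((ts, url))      # continue session
        else:
            sessions[user].append([(ts, url)])        # start a new session
    return sessions

log = [("10.0.0.1", "Mozilla", 0, "/a"), ("10.0.0.1", "Mozilla", 600, "/b"),
       ("10.0.0.1", "Mozilla", 5000, "/c")]           # >30 min gap -> new session
print({u: len(s) for u, s in sessionize(log).items()})  # 2 sessions
```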
Full Paper

IJCST/34/4/A-1261
   167 Conception of E-Governance Using Phased Approach

Saman Mujtava, Dr. Prashant Kumar Pandey

Abstract

The significance of Information and Communication Technology cannot be overstressed, with ICT affecting all aspects of life such as education, entertainment, and the Internet. This paper deals with the steps and phases of implementing ideal eGovernment. The obstacles that can be expected during the realization of eGovernance are discussed, along with the policy to address these barriers. This work provides details of a phased approach towards realizing complete eGovernance.
Full Paper

IJCST/34/4/A-1262
   168 Segmentation and Classification of Land use Land Cover Change Detection Using Remotely Sensed Data for Coimbatore District, India

Dr. K. Thanushkodi, Y. Baby Kalpana, M. Sharrath

Abstract

Land use is clearly constrained by environmental factors like soil characteristics, climatic conditions, water sources and vegetation. Change in land use and land cover is a dynamic process taking place on the earth's surface, and the spatial distribution of the changes that have taken place over a period of time and space is of immense importance in many natural studies. Whether regional or local in scope, remote sensing offers a means of acquiring and presenting land cover data in a timely manner. The environmental factors reflect the importance of land as a key and finite resource for most human activities, including agriculture, industry, forestry, energy production, settlement, recreation, and water sourcing and storage. Improper land use often causes various forms of environmental degradation. For sustainable utilization of land ecosystems, it is essential to know the natural characteristics, extent and location of the land, its quality, productivity, suitability and the limitations of various land uses. Land use/land cover change has become an important component in current strategies for managing natural resources and monitoring environmental changes. Assessing the spread and health of the world's forest, grassland and agricultural resources has become an important priority, and viewing the earth from space is now crucial to understanding the influence of man's activities on his natural resource base over time. Over the past years, data from Earth-sensing satellites (digital imagery) has become vital in mapping the Earth's features and infrastructure, managing natural resources and studying environmental change.
Full Paper

IJCST/34/4/A-1263
   169 Design and Analysis of a Multithreaded Web Archiving System for Offline Browsing Using Graph Searching Algorithms

Dr. B. Vijaya Babu, P. Sunitha

Abstract

Web archiving (or mirroring) is a technique aimed at downloading both the static and dynamic web pages of a website onto a user-specified location, so that the downloaded pages can be browsed at a later time under offline conditions. This process re-creates the entire web site as a mirror of the original at the user-specified location. In this paper, a multithreaded web archiving system is designed using breadth-first, depth-first, best-first and A* best-first graph search algorithms. This system not only performs the task of digital preservation of websites but also facilitates their offline browsing. Various case studies have been conducted and analyzed to evaluate the performance of the web archiving system in terms of offline browsing efficiency and running time; these parameters are estimated for each of the graph searching algorithms and compared. The performance of the web archiving system is also briefly compared with some existing related applications.
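As a hedged illustration of the breadth-first strategy, the sketch below fetches pages level by level from a seed URL, saves each to disk and queues same-site links; depth-first and best-first variants differ only in the frontier ordering, and the URL handling here is deliberately simplistic.

```python
# Single-threaded BFS mirroring sketch (the paper's system is multithreaded).
import re
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

def archive_bfs(seed, max_pages=20):
    site = urlparse(seed).netloc
    frontier, seen, fetched = deque([seed]), {seed}, 0
    while frontier and fetched < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue                                    # skip unreachable pages
        fetched += 1
        with open(re.sub(r"\W+", "_", url) + ".html", "w") as f:
            f.write(html)                               # local mirror copy
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)
            if urlparse(link).netloc == site and link not in seen:
                seen.add(link)
                frontier.append(link)                   # FIFO -> breadth-first

archive_bfs("http://example.com/")
```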
Full Paper

IJCST/34/4/A-1264
   170 Professional Education in Different Institutions

Gaurav Jindal, Dr. J K Tyagi

Abstract

In the contemporary world, professional education is a growing field. We have a few established institutes, while a large number of new institutes are springing up day by day, and there seems to be a big gap between these two types of institutions in many ways. Though both types of institutes provide the same type of degree(s), many differences remain: mainly, one degree is accepted everywhere while the other does not carry the same weight. Needless to say, the same course completed by two students at different institutes granting the same degree puts them into different spheres. This paper aims to identify these disparities and to draw the reader's attention to the need for swift action in finding a solution.
Full Paper

IJCST/34/4/A-1265
   171 Detection of Reliable Software Using SPRT & Pareto Type II SRGM

Dr. R. Satya Prasad, N Geetha Rani, Dr. R. R. L. Kantam

Abstract

Classical hypothesis testing is performed on whole volumes of data collected before any analysis. In sequential analysis, by contrast, each observation is examined as it accumulates; this approach allows one to draw inferences, and possibly reach a final conclusion, at a much earlier stage. The procedure adopted for this is the Sequential Probability Ratio Test (SPRT). This paper shows how to use Wald's Sequential Probability Ratio Test (SPRT) to determine the reliability or unreliability of software products. The well-known SPRT procedure is adopted for the Pareto type II software reliability growth model (SRGM), which is based on the Non-Homogeneous Poisson Process (NHPP). The performance of the proposed Pareto type II model is demonstrated using 6 data sets.
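A minimal sketch of Wald's SPRT decision loop follows; it uses a plain Poisson likelihood on per-interval failure counts purely for illustration, whereas the paper embeds the SPRT in the Pareto type II NHPP mean-value function.

```python
# Wald's SPRT: accumulate the log-likelihood ratio and stop at a boundary.
import math

def sprt(observations, lam0=1.0, lam1=2.0, alpha=0.05, beta=0.05):
    lower = math.log(beta / (1 - alpha))     # accept H0 (reliable) below this
    upper = math.log((1 - beta) / alpha)     # accept H1 (unreliable) above this
    llr = 0.0
    for n, x in enumerate(observations, 1):  # x: failures in one interval
        llr += x * math.log(lam1 / lam0) - (lam1 - lam0)   # Poisson LLR term
        if llr <= lower:
            return "reliable", n
        if llr >= upper:
            return "unreliable", n
    return "continue testing", len(observations)

print(sprt([0, 1, 0, 0, 0, 0]))   # reaches a decision after a few intervals
```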
Full Paper

IJCST/34/4/A-1266
   172 Public Cryptography Linear Programming Solver in Cloud Computing

S. A. Phatangare, Dr. P. K. Deshmukh, R. A. Deshmukh

Abstract

Public-key cryptography refers to a cryptographic system requiring two separate keys, one of which is secret and one of which is public. Cloud computing has great potential to provide robust computational power to society at reduced cost. It enables customers with limited computational resources to outsource their large computation workloads to the cloud and economically enjoy massive computational power, bandwidth, storage, and even appropriate software that can be shared in a pay-per-use manner. Despite the tremendous benefits, security is the primary obstacle that prevents the wide adoption of this promising computing model, especially for customers whose confidential data are consumed and produced during the computation. Treating the cloud as an intrinsically insecure computing platform from the viewpoint of cloud customers, we must design mechanisms that not only protect sensitive information by enabling computations with encrypted data, but also protect customers from malicious behaviors by enabling validation of the computation result. Focusing on engineering computing and optimization tasks, we investigate the secure outsourcing of widely applicable linear programming (LP) computations. In order to achieve practical efficiency, our mechanism design explicitly decomposes LP computation outsourcing into public LP solvers running on the cloud and private LP parameters owned by the customer. To validate the computation result, we explore the fundamental duality theorem of LP and derive the necessary and sufficient conditions that a correct result must satisfy. In the proposed algorithm, the client checks whether the server has returned the correct output, and we also try to devise a robust algorithm for numerical stability.
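As a hedged illustration of duality-based result validation, the sketch below accepts a returned primal/dual pair only if both are feasible and their objectives coincide (strong duality); scipy stands in for the untrusted cloud solver, and the problem data is illustrative.

```python
# Validate an LP result: for min c^T x s.t. A x >= b, x >= 0, accept the
# returned (x, y) only if both are feasible and c^T x == b^T y.
import numpy as np
from scipy.optimize import linprog

c = np.array([2.0, 3.0])
A = np.array([[1.0, 1.0], [1.0, 2.0]])
b = np.array([4.0, 6.0])

primal = linprog(c, A_ub=-A, b_ub=-b)        # min c^T x with A x >= b, x >= 0
dual = linprog(-b, A_ub=A.T, b_ub=c)         # max b^T y with A^T y <= c, y >= 0

x, y = primal.x, dual.x
feasible = (np.all(A @ x >= b - 1e-8) and np.all(x >= -1e-8)
            and np.all(A.T @ y <= c + 1e-8) and np.all(y >= -1e-8))
print("accept result:", feasible and abs(c @ x - b @ y) < 1e-6)
```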
Full Paper

IJCST/34/4/A-1267
   173 Minimizing Application Layer DDOS Attacks Using Website Customization

P. Ram Mohan Rao, Dr. K. Venkateswara Reddy, S. V. Hemanth

Abstract

DDoS attacks are a continuous and critical threat to the Internet, especially to commercial websites. Application layer attacks use genuine HTTP requests and create a devastating effect on computing resources, in turn resulting in denial of service. An anomaly detector based on a Hidden Markov Model (HMM) can be used to detect such attacks, but the model needs to be trained using complex algorithms which may affect response time significantly. Some useful customizations during website design can minimize application layer DDoS attacks.
Full Paper

IJCST/34/4/A-1268
   174 Predicting User Behavior and Comparison in Sequential Data Mining Techniques

Ankit Kumar, Bhasker Pant

Abstract

The web log file contains huge amounts of hidden, valuable information pertaining to visitors which, if mined, can be used for predicting their navigation behavior. Web log files are generated as a result of the interaction between clients and service providers on the web; however, the task of discovering frequent sequence patterns from web logs is challenging. Sequential pattern mining plays a significant role as a promising approach to modelling the access behavior of users. This paper focuses on adopting an intelligent technique that can provide personalized web service for accessing related web pages more efficiently and effectively, so that it can be determined which web pages are more likely to be accessed by the user in the future. The paper uses three intelligent algorithms for predicting user behavior, namely Apriori, Eclat and FP-growth, and compares their performance in terms of time and space complexity on the filtered data.
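For reference, a minimal sketch of the Apriori candidate-generation loop on toy navigation baskets is shown below; Eclat and FP-growth compute the same frequent itemsets with different data structures (vertical tid-lists and a prefix tree).

```python
# Level-wise Apriori: count size-k candidates, keep the frequent ones, and
# join survivors into size-(k+1) candidates (prune step omitted for brevity).
from itertools import combinations

def apriori(transactions, min_support=2):
    transactions = [set(t) for t in transactions]
    items = {i for t in transactions for i in t}
    frequent, k_sets, k = {}, [frozenset([i]) for i in sorted(items)], 1
    while k_sets:
        counts = {c: sum(c <= t for t in transactions) for c in k_sets}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        keys = list(survivors)
        k_sets = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent

logs = [["home", "news"], ["home", "sports"], ["home", "news", "mail"]]
print(apriori(logs))   # frequent page-sets with their support counts
```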
Full Paper

IJCST/34/4/A-1269
   175 Comparison and Implementation of Biometric Inspired Digital Image Steganography

Rupa Maan, Lovneesh Bansal

Abstract

Recent progress in digital multimedia technologies has offered many facilities for the transmission, reproduction and manipulation of data. However, this advancement has also brought challenges, such as copyright protection for content providers. In the present work, two techniques, LSB-based and biometric pattern (skin tone) based steganography, have been compared, and the results were evaluated by comparing the quality of the image after embedding and the change in the PSNR value of the image. This project is developed for hiding and storing information in any image file using a biometric pattern: a biometric steganography scheme is presented that uses the skin regions of images in the DWT domain for embedding secret data. This work also presents results for skin detection, which performed well in identifying the skin regions of the given images. By embedding data only in a certain region (here, the skin region) and not in the whole image, security is enhanced. To evaluate the quality of the stego image after embedding the secret message, the Peak Signal to Noise Ratio (PSNR) has been used, and the performance in terms of capacity and PSNR (in dB) is demonstrated for this method. From the results it is observed that the method preserves the histogram of the DWT coefficients after embedding, which is the main reason why there is not much visual difference between the cover image and the embedded image. Performing biometric steganography thus offers a respectable level of security: adopting an embedding algorithm that preserves the histogram of DWT coefficients prevents histogram-based attacks.
Full Paper

IJCST/34/4/A-1270
   176 Calculating Grid Reliability in Bayesian Networks

Ashwani Malhotra, Rakesh Gagneja

Abstract

Grid computing has emerged as an important field for complex systems that need sharing of resources in a large-scale environment, wide-area communication and multi-institutional collaboration; it is used where a large amount of computational resources is needed. Grid environments differ from ordinary distributed environments in the large-scale resource sharing involved. With the increase in the size of these systems, the need to evaluate their reliability is increasing day by day, and various methods for estimating reliability are being developed. Bayesian Networks (BN) can also be used to estimate grid reliability. A BN is used to depict the interactions between the various components of the distributed system and show the relationships between them: it represents the probabilistic associations between system components in a very simple and easy way, using a directed acyclic graph where nodes represent variables and the links between pairs of nodes represent causal relationships between the variables. BN is therefore a well-suited method for evaluating the reliability of grid services. This paper discusses the use of Minimal Resource Spanning Trees (MRSTs) and BN in estimating grid service reliability, and shows that Bayesian Networks allow easy representation of grid networks, so that only historical data is required to construct them. In this paper, the K2 algorithm has been extended to estimate the reliability of the MRST and used to build the BN.
Full Paper

IJCST/34/4/A-1271
   177 Honeypot: A Survey

Neha Sahu, Vineet Richhariya

Abstract

Information security is a growing concern today for organizations and individuals alike. This has led to growing interest in more aggressive forms of defense to supplement existing methods. One of these methods involves the use of honeypots. A honeypot is a security resource whose value lies in being probed, attacked or compromised. It is an outstanding technology that helps us learn new hacking techniques from attackers and intruders: the more information we collect from multiple honeypot servers, the more exact the attack patterns we can generate, and we can also try to find the location of the attacker. In this paper, we present an overview of honeypots and provide a starting point for those interested in this technology. We examine different kinds of honeypots, honeypot concepts, and approaches, in order to recommend measures that enhance security using these technologies.
Full Paper

IJCST/34/4/A-1272
   178 A Fast Retrieval of Software Reusable Components Using Bit Map Index

P. Niranjan, M. Venugopal Reddy, P. Shireesha

Abstract

As component libraries scale up and reuse practice deepens, efficient management of the component repository becomes a precondition of component reuse. Component reuse is favoured when components are organized in the library according to an appropriate classification. Building on faceted classification, this paper presents a component classification model based on facets and bitmap indexing, and experiments prove it effective. The input is the set of attributes of a component; in this paper we use three attributes: language, operating system and file type. Based on these attributes, we perform bitmap indexing and retrieve the required component from the component library.
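A minimal sketch of facet retrieval with bitmap indexes follows: each (facet, value) pair maps to a bit vector over the components, and a query is answered by AND-ing the requested facets' bitmaps; the component data is illustrative.

```python
# Bitmap index over three facets; bit i of a bitmap means component i matches.
components = [
    {"language": "Java", "os": "Linux",   "filetype": "jar"},
    {"language": "C",    "os": "Windows", "filetype": "dll"},
    {"language": "Java", "os": "Windows", "filetype": "jar"},
]

index = {}
for i, comp in enumerate(components):
    for facet, value in comp.items():
        index[(facet, value)] = index.get((facet, value), 0) | (1 << i)

def query(**facets):
    bits = ~0                                 # start with all bits set
    for facet, value in facets.items():
        bits &= index.get((facet, value), 0)  # intersect facet bitmaps
    return [i for i in range(len(components)) if bits >> i & 1]

print(query(language="Java", os="Windows"))  # -> [2]
```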
Full Paper

IJCST/34/4/A-1273
   179 Mining of Frequent OptimisticEstimations by Using Measured Techniques

M. Lavanya, Dr. M. Usha Rani

Abstract

In recent years the sizes of databases have increased rapidly. This has led to a growing interest in the development of tools capable of the automatic extraction of knowledge from data. The term Data Mining, or Knowledge Discovery in Databases, has been adopted for a field of research dealing with the automatic discovery of implicit information or knowledge within databases. Several efficient algorithms have been proposed for finding frequent itemsets, from which the association rules are derived, such as the Apriori algorithm. These Apriori-like algorithms suffer from the costs of handling a huge number of candidate sets and scanning the database repeatedly. A frequent pattern tree (FP-tree) structure for storing compressed and critical information about frequent patterns has been developed for finding the complete set of frequent itemsets; this approach avoids the costly generation of a large number of candidate sets and repeated database scans, and is regarded as the most efficient strategy for mining frequent itemsets. Finding infrequent items gives positive feedback to the production manager. In this paper, we find frequent and infrequent itemsets from the opinions of different customers, by using a dissimilarity matrix between frequent and infrequent items and a binary variable technique. We also use the AND gate logic function for finding opinions on frequent and infrequent items. After finding the frequent and infrequent items, we apply Classification Based on Associations (CBA) to them to obtain a better classification.
Full Paper

IJCST/34/4/A-1274
   180 Technology in Education

Barjinder Kaur

Abstract

Technology education is the study of human innovation, which provides an opportunity for students to apply and manage knowledge and resources related to the human-made world. It incorporates collaborative, application-oriented, activity-based strategies used to develop creative thinking skills while solving real-world problems. The study of technology education prepares students to become lifelong contributing members of our technological society who comprehend the impact of technology and use it to improve the quality of life for all people.
Full Paper

IJCST/34/4/A-1275
   181 AGILE: The Innovation with Intelligence

Jayanthan. R, R. Gowtham, Praveen T, Sivakumar P, Sankar. G

Abstract

"AGILE" is a most powerful project model, based on cross-functional teamwork and continuous integration of the process, which leads to successful project delivery. We had been following the traditional waterfall and V-shaped models in our organization until we started a project with AGILE: a project with 4 sprints and a total of 30 days to the project deadline. In this model the development and Quality Control (QA) teams work together, and in the scrum call the estimation is made by various resources across geographic locations, so the development planning is made at an accurate level because the estimation is done by multiple brains and finalized after discussion. Also, while the developers write the code, the QA team prepares the necessary test cases, which can be shared with the developers and so eliminate issues during coding. In the QA team, the tester gets a chance to test every nook and corner of the functional areas; since each user story is small, everyone gains fair knowledge at the business level, so in the end the code delivered is error-free and as intended. The daily scrum call helps everyone in the team to stay updated and gain fair knowledge of the status, so if any part of the team faces any blocks or hurdles, these can be owned by the expert of the team and completed on time. This method helped us greatly in the proper delivery of the project on time, with 100% accuracy and the functionality as expected.
Full Paper

IJCST/34/4/A-1276
   182 Adaptive Fuzzy Search Over Encrypted Data in Cloud Computing

Ranjeeth Kumar. M, S.S.V.N. Sarma

Abstract

In a cloud computing environment, data security is an ongoing, challenging task, so sensitive data has to be encrypted before outsourcing. In the existing technique, files are retrieved from the cloud by searching keywords over the encrypted data [3]. Many searching techniques have been implemented in the cloud, but their disadvantage is that they support only exact keyword search [2]. Typical users' searching behaviors deviate from exact keywords very frequently; these drawbacks of the existing system make it unsuitable for the cloud computing environment and affect system usability [4]. Our proposed work in this paper concentrates on solving the problems of users who search data with the help of fuzzy keywords in the cloud [1]. We formalize and solve the problem of effective fuzzy keyword search over encrypted cloud data while maintaining keyword privacy [6]. Using adaptive fuzzy search, the exact keywords are displayed along with similar keywords and the number of views, which solves the problems faced by cloud users. Adaptive fuzzy search is a new direction of research within the area of search over encrypted data in cloud computing and user-model-based interfaces; such systems build a model of the individual user and apply it to user adaptation. We show that our proposed solution is secure and privacy-preserving [5], while correctly realizing the goal of fuzzy keyword search.
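As a hedged illustration of one common construction (the wildcard-based fuzzy keyword set for edit distance 1, which may differ from this paper's exact scheme), the sketch below expands both the indexed keywords and the query and declares a match on any overlap; real schemes index trapdoors of these strings rather than plaintext.

```python
# Wildcard-based fuzzy keyword set for edit distance 1: one '*' either
# replaces a character (substitution/deletion) or is inserted (insertion).
def fuzzy_set_d1(word):
    s = {word}
    for i in range(len(word)):
        s.add(word[:i] + "*" + word[i + 1:])    # one char wildcarded
    for i in range(len(word) + 1):
        s.add(word[:i] + "*" + word[i:])        # one wildcard inserted
    return s

index = {w: fuzzy_set_d1(w) for w in ["castle", "cloud", "secure"]}

def fuzzy_search(query):
    q = fuzzy_set_d1(query)
    return [w for w, s in index.items() if s & q]   # any overlap = fuzzy match

print(fuzzy_search("clouds"))   # one-letter insertion still matches "cloud"
```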
Full Paper

IJCST/34/4/A-1277
   183 A New Approach in WSN for Secure Data Aggregation Using ANN

Gayatri Dagar, Amanpreet Kaur

Abstract

Security is one of the major concerns in achieving secure communication. When data travels over the network, there are greater chances of an active or passive attack. The proposed approach detects active attacks in the network: if some user adds extra information to a data packet or destroys some information, the proposed approach can detect such false data packets. We present a neural network based approach to detect faults in data packets. The proposed approach uses a nonlinear sensor model in which nodes are placed dynamically, and combines the concepts of data verification and user authentication with data aggregation. The approach addresses both the integrity and the security of the transferred data.
Full Paper

IJCST/34/4/A-1278