
 

INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST)-VOL IV ISSUE II, VER. 2, APR. TO JUNE, 2013


International Journal of Computer Science and Technology Vol. 4 Issue 2, Ver. 2
S.No. Research Topic Paper ID
41 RAID: Characteristics of a Computer Supported Learner Friendly System Based on Mental Models

Yasir Ahmad

Abstract

Computers have been used to assist learning for over forty years. Many of the systems developed in the field of computer-aided learning have been highly specialized: they have either been designed solely for a particular course, or they have concentrated on a particular aspect of support, be it communication, provision for practical work, interactive demonstrations, and so on. What is required is a computer-supported learner-friendly system based on the learners' mental models and on the Reflective, Adaptive, Interactive, and Discursive characteristics discussed in this article.
Full Paper

IJCST/42/2/B-1460
42 Improving the Relevance Ranking in Search Engines: An Overview

Nidhi Saxena, Dr. Vivek Chandra

Abstract

The World Wide Web is a rich source of information, and it continues to expand in size and complexity. Because of the volume and intricacy of web content, retrieving the required web page efficiently and effectively has become a challenge for search engines. This paper details the techniques adopted by search engines, such as PageRank and HITS, for determining the relevance ranking of web content pertinent to a given query, along with the issues they encounter. Newer techniques introduced by popular search engines such as Google to improve relevance ranking, including personalized search, real-time search, and mobile search, are also discussed at length.
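As a minimal sketch of the link-analysis idea behind PageRank, the power iteration below ranks a toy four-page web graph; the page names, link structure, and damping factor d = 0.85 are illustrative assumptions, not values from the paper.

```python
# Minimal PageRank power iteration on a toy 4-page web graph.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = sorted(links)
n = len(pages)
d = 0.85                               # damping factor
rank = {p: 1.0 / n for p in pages}     # start from a uniform distribution

for _ in range(50):                    # iterate until ranks stabilise
    new = {p: (1 - d) / n for p in pages}
    for p, outs in links.items():
        share = rank[p] / len(outs)    # a page splits its rank over outlinks
        for q in outs:
            new[q] += d * share
    rank = new

best = max(rank, key=rank.get)         # page C is linked by three pages
```

Since every page here has at least one outlink, the rank values remain a probability distribution (they sum to 1) throughout the iteration.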
Full Paper

IJCST/42/2/B-1461
43 A Comparative Study of Failure Data for Software Reliability Estimation

S. Charles Ilayaraja, Dr. M.Ganaga Durga

Abstract

Software reliability is one of the attributes of quality, and its measurement is supported by Software Reliability Growth Models (SRGMs). Among the many proposed SRGMs, some have been widely used while a few have become obsolete. Successful SRGMs are characterized by the accuracy of their reliability estimates. In such models, the reliability estimate depends on the quality of the failure data, and its accuracy increases with the length of the failure-observation period. In this paper, we study the role of the failure data set in reliability estimation by applying three different data sets to a unified SRGM that makes no distinction between the failure-observation and fault-removal processes. The selected SRGM is incorporated with two distribution functions: one exponential and the other a two-stage Erlang distribution. For easy implementation, we assume that the debugging process is perfect; that is, the probability of error correction is one and no errors are introduced during debugging. Three cases of failure data are considered: database application software, a web server, and the interface of an operating system. The unknown parameters of each case are estimated using the SMERFS tool, and the goodness-of-fit analysis is done in the MATLAB environment.
Full Paper

IJCST/42/2/B-1462
44 Approaches to the Travelling Salesman Problem Using Genetic Algorithm Approaches

Ankur Sharma, Ratan Mishra, Vijay Maheshwari

Abstract

The Travelling Salesman Problem (TSP) is an example of an NP-hard combinatorial optimization problem. In the TSP, a salesman travels through N cities and returns to the starting city with minimal cost; he is not allowed to visit any city more than once. To solve this problem we use a genetic algorithm approach, as genetic algorithms are designed to tackle NP-hard problems. In this paper, the genetic algorithm is used for the initialization of the population and the improvement of offspring produced by crossover for the TSP. Various solutions are considered during the search procedure, and the population evolves until a solution satisfies the termination criteria. The advantages of the GA over other heuristic methods for solving combinatorial optimization problems include massive parallelism and the ability to solve "non-linear" optimization problems where the search space is extremely large.
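A compact sketch of the GA loop described above: tours are permutations of city indices, order crossover recombines two parents into a valid child tour, and a swap mutation perturbs offspring. The city coordinates, population size, and rates are illustrative assumptions, not the paper's parameters.

```python
import random

random.seed(1)
cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 1)]

def tour_length(tour):
    # total cycle length, returning to the starting city
    return sum(((cities[a][0] - cities[b][0]) ** 2 +
                (cities[a][1] - cities[b][1]) ** 2) ** 0.5
               for a, b in zip(tour, tour[1:] + tour[:1]))

def crossover(p1, p2):
    # order crossover: keep a slice of p1, fill the rest in p2's order
    i, j = sorted(random.sample(range(len(p1)), 2))
    middle = p1[i:j]
    rest = [c for c in p2 if c not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(tour, rate=0.2):
    t = tour[:]
    if random.random() < rate:         # occasionally swap two cities
        a, b = random.sample(range(len(t)), 2)
        t[a], t[b] = t[b], t[a]
    return t

pop = [random.sample(range(len(cities)), len(cities)) for _ in range(30)]
for _ in range(100):                   # fixed generation count as the stop rule
    pop.sort(key=tour_length)
    elite = pop[:10]                   # keep the fittest tours
    children = [mutate(crossover(*random.sample(elite, 2)))
                for _ in range(20)]
    pop = elite + children

best = min(pop, key=tour_length)
```

Elitist selection plus order crossover keeps every candidate a valid permutation, which is the main constraint that distinguishes GA encodings for the TSP from bit-string GAs.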
Full Paper

IJCST/42/2/B-1463
45 A Study of Cloud Computing

Beant Kaur

Abstract

Cloud computing is an increasingly popular paradigm for accessing computing resources that aims to provide a customized, QoS-guaranteed, dynamic computing environment for the end user. Since the cloud is a broad collection of services, an organization can choose where, when, and how it uses cloud computing. In this paper we survey cloud computing concepts: its definition, models, characteristics, and disadvantages, followed by conclusions and future work.
Full Paper

IJCST/42/2/B-1464
46 Dual Batch System in Streaming Data Warehouses

Silambarasan. S, Kalimuthu. M

Abstract

In a streaming data warehouse, the scheduling of updates must keep pace with fast-changing online content. In the existing scheme, the update and proportional algorithms do not support multi-way joins; they support only single-way joins, in which data updates are slow. Multiple online updates make data-query extraction fast and well organized. For more efficiency and flexibility in scheduling and updating the data warehouse, a Dual Batch system is introduced. It minimizes the use of system resources and supports multi-way joins in which very large numbers of records are updated quickly, reducing update time. With the dual batch system it is possible to reduce the time and increase the speed of updates and the scheduling of the data warehouse, making online update and extraction of data fast and efficient. Experimental simulations are carried out to evaluate the performance of the proposed optimization technique with both synthetic and real data sets in terms of execution time and operational overhead, and to find the optimal threshold for multi-relational join generation.
Full Paper

IJCST/42/2/B-1465
47 A Comparative Study of OLTP and OLAP Technologies

Jatinderpal Singh, Anshul Sood

Abstract

This paper provides an overview of data warehousing and a comparative study of OLAP and OLTP technologies. The data warehouse supports On-Line Analytical Processing (OLAP), whose functional and performance requirements differ from those of the On-Line Transaction Processing (OLTP) applications supported by operational databases. Data warehousing and OLAP are essential elements of decision support. OLTP is a class of program that facilitates and manages transaction-oriented applications, while an OLAP system is used for data analysis by knowledge workers, including managers, business executives, and market analysts. Data warehousing and OLAP have emerged as leading technologies that facilitate data storage, organization, and retrieval.
Full Paper

IJCST/42/2/B-1466
48 Context Awareness through Cross-Layer Network Architecture

Murthy D.H.R., Ramesh M.Badiger

Abstract

Layered architectures are not sufficiently flexible to cope with the dynamics of wireless-dominated next-generation communications. Cross-layer approaches may provide a better solution by allowing interactions between two or more nonadjacent layers in the protocol stack. Cross-layer architectures based on purely local information, however, cannot support system-wide cross-layer performance optimization or context awareness. This paper presents a new cross-layer architecture that provides a hybrid local and global view, using gossiping to maintain consistency, and illustrates the possibilities for context awareness in communications through two examples: the first uses user-centric context to control the available link bandwidth and satisfy the user accordingly, and the second uses contextual information to control the transmission power of a mobile node.
Full Paper

IJCST/42/2/B-1467
49 Fuzzy Logic Based Control System for Washing Machines

Deepak Kumar, Yousuf Haider

Abstract

Washing machines are a common feature in Indian households today. Their most important utility is the effort saved in brushing, agitating, and washing different types of clothes, which need different amounts of washing time depending on the type of dirt, the amount of dirt, the cloth quantity, and so on. Today's washing machines serve all the purposes of washing, but determining how much agitation time each load needs remains an important problem. The work presented in this paper describes a procedure for obtaining a suitable washing time for different clothes with the help of fuzzy logic control. The procedure takes inputs from sensors, subjects them to fuzzy arithmetic, and obtains a crisp value of the washing time.
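A minimal sketch of the sensor-input-to-crisp-output flow, reduced to one input: triangular membership functions fuzzify a dirt-level reading, each rule proposes a wash time, and a weighted average defuzzifies the result (Sugeno style rather than the paper's exact method). The breakpoints and rule outputs are illustrative assumptions.

```python
def tri(x, a, b, c):
    # triangular membership function peaking at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def wash_time(dirt):
    # rule firing strengths for Low / Medium / High dirt (dirt in 0..100)
    low = tri(dirt, -1, 0, 50)
    med = tri(dirt, 10, 50, 90)
    high = tri(dirt, 50, 100, 101)
    # each rule proposes a crisp wash time in minutes; defuzzify by
    # taking the firing-strength weighted average
    num = low * 20 + med * 40 + high * 60
    den = low + med + high
    return num / den

t = wash_time(75)   # partly Medium, partly High -> between 40 and 60 min
```

A full controller would add cloth quantity and dirt type as further inputs, with one rule per input combination, but the fuzzify / infer / defuzzify pipeline is the same.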
Full Paper

IJCST/42/2/B-1468
50 An Adaptive Application Security to High-Speed Mobile Devices

Ronda Madhu S R

Abstract

The security of mobile smartphones is a matter of concern in the fast-spreading world of network communications. Different threats to high-speed phones are listed, detailing the most common forms of attack that exploit common features such as SMS, MMS, Wi-Fi, and GSM networks. Attacks based on vulnerabilities in applications are presented, taking note of flaws in web browsers and operating systems. Application security is detailed with various runtime and design-time security-permission assignment schemes under various scenarios. Smartphone application risks are brought to light, and the collaborative service model of emerging applications is discussed, suggesting that stand-alone applications may soon be eclipsed and highlighting that security is a major concern in the mobile industry today.
Full Paper

IJCST/42/2/B-1469
51 Authentication and Integrity Protection for DNS Data Using DSA Algorithm

Shallu Singh, Sheela Verma

Abstract

The mapping, or binding, of IP addresses to host names became a major problem in the rapidly growing Internet, and the higher-level binding effort went through different stages of development up to the currently used Domain Name System (DNS). DNS Security is designed to provide security by combining the concepts of digital signatures and asymmetric (public-key) cryptography. Here the public key is sent instead of the private key. DNS security uses a message-digest algorithm to compress the message (a text file) and a PRNG (Pseudo-Random Number Generator) algorithm for generating the public and private keys. The message is combined with the private key to form a signature using the DSA algorithm, which is sent along with the public key. The receiver uses the public key and the DSA algorithm to check the signature; if it matches the signature of the received message, the message is accepted and read, otherwise it is discarded.
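The sign/verify flow described above can be sketched with a toy DSA instance. The tiny parameters (p = 23, q = 11, g = 4) and the sample resource record are illustrative assumptions purely to make the arithmetic visible; real DNSSEC keys use parameters of 1024 bits and more, and the hash is reduced mod q only because q is tiny here.

```python
import hashlib
import random

p, q, g = 23, 11, 4          # toy parameters: q divides p-1, g has order q mod p
x = 7                        # private key
y = pow(g, x, p)             # public key, published alongside the DNS data

def h(msg):
    # message digest, reduced mod q for this toy group
    return int(hashlib.sha256(msg).hexdigest(), 16) % q

def sign(msg):
    while True:
        k = random.randrange(1, q)              # fresh per-signature nonce
        r = pow(g, k, p) % q
        if r == 0:
            continue
        s = (pow(k, -1, q) * (h(msg) + x * r)) % q
        if s != 0:
            return r, s

def verify(msg, r, s):
    w = pow(s, -1, q)                           # modular inverse (Python 3.8+)
    u1, u2 = (h(msg) * w) % q, (r * w) % q
    v = (pow(g, u1, p) * pow(y, u2, p) % p) % q
    return v == r

record = b"example.com. A 93.184.216.34"        # hypothetical signed RR
r, s = sign(record)
ok = verify(record, r, s)
```

The receiver needs only the public key y and the signature (r, s); the private key x never leaves the signer, which is the property the abstract's scheme relies on.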
Full Paper

IJCST/42/2/B-1470
52 Genetic Algorithm Approaches for CPU Scheduling in Operating System

Manu Sharma, Preeti Sindhwani, Vijay Maheshwari

Abstract

To increase CPU performance and achieve maximum throughput, many scheduling approaches are used to schedule the processes waiting in the ready queue for their turn to be processed. This paper presents a genetic algorithm approach that provides a near-optimal solution for scheduling processes. CPU scheduling is an NP-hard problem, and genetic algorithms provide near-optimal solutions to such problems. These algorithms maintain a population of possible solutions, selected according to a fitness function; crossover and other operators are used to generate the fittest population. This paper compares traditional CPU scheduling approaches, in terms of average waiting time, with a GA-based scheduling algorithm to find the optimal solution.
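The average-waiting-time metric used in the comparison can be computed directly from an execution order, which is also exactly what a GA fitness function over process orderings would evaluate. The burst times are illustrative assumptions, with all jobs arriving at time zero.

```python
# Average waiting time under FCFS vs SJF for a sample burst-time set.
bursts = [24, 3, 3, 7]   # CPU burst times in the arrival order

def avg_wait(order):
    elapsed, total = 0, 0
    for b in order:
        total += elapsed     # this job waited for everything scheduled before it
        elapsed += b
    return total / len(order)

fcfs = avg_wait(bursts)          # run in arrival order: 20.25
sjf = avg_wait(sorted(bursts))   # shortest job first: 5.5
```

A GA-based scheduler would search over permutations of the ready queue using avg_wait (or a weighted variant) as the fitness function, with SJF's ordering as the known optimum for this metric when all jobs arrive together.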
Full Paper

IJCST/42/2/B-1471
53 Efficient Image Segmentation Using Watershed Transform

Niket Amoda, Ramesh K Kulkarni

Abstract

The k-means clustering algorithm is very fast and simple to implement, but it provides only coarse image segmentation. Better retrieval results can be expected by employing a more sophisticated segmentation technique. For this purpose, a novel texture-gradient-based watershed segmentation technique is developed. The watershed transform is a well-established tool for the segmentation of images; however, it is often not effective for textured image regions that are perceptually homogeneous. In order to properly segment such regions, the concept of the texture gradient is introduced and implemented using a non-decimated wavelet packet transform. A marker-location algorithm is subsequently used to locate significant homogeneous textured or non-textured regions, and a marker-driven watershed transform then segments the identified regions. The experimental results demonstrate the superiority of this technique over k-means clustering.
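To make the coarse k-means baseline concrete, the sketch below clusters grayscale pixel intensities into k = 2 classes (dark vs bright), which is the 1-D essence of intensity-only segmentation; the pixel values and initial centroids are illustrative assumptions. It is exactly this intensity-only view that ignores texture and motivates the texture-gradient watershed.

```python
# 1-D k-means (k=2) over grayscale pixel values.
pixels = [12, 15, 14, 200, 198, 205, 13, 202]
centroids = [0.0, 255.0]           # initial guesses: dark and bright

for _ in range(10):
    # assign each pixel to its nearest centroid
    clusters = [[], []]
    for p in pixels:
        idx = min(range(2), key=lambda i: abs(p - centroids[i]))
        clusters[idx].append(p)
    # recompute each centroid as its cluster mean (keep old value if empty)
    centroids = [sum(c) / len(c) if c else centroids[i]
                 for i, c in enumerate(clusters)]

labels = [min(range(2), key=lambda i: abs(p - centroids[i])) for p in pixels]
```

Every pixel gets a label from intensity alone, so two differently textured regions with the same mean brightness collapse into one segment, which is the failure mode the paper's technique addresses.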
Full Paper

IJCST/42/2/B-1472
54 Cloud Computing: Providing Services Using Virtualization Technique

Dheeraj Sharma, Avinav Pathak

Abstract

Cloud computing is a service that assigns virtualized resources, picked from a large-scale pool of distributed computing resources in a cloud computing infrastructure, to each consumer. Cloud computing is a fused computing paradigm that includes virtualization, grid computing, utility computing, server-based computing, and network computing, rather than an entirely new computing technique. Cloud computing provides services to customers transparently: users get the applications and services they want without needing to know how they work. The current project provides a data-handling service to customers for large-scale commercial use while ensuring reliability and storage space for the user. The research provides a cloud-computing-based infrastructure service composed of computation, data storage, and database access, using virtualization to make the resources easily available. Virtualization technology is responsible for providing a virtual operating system to users in their browser, where they can use software services simultaneously. From the user's perspective, the project also provides a web service interface to launch, manage, and terminate VMs.
Full Paper

IJCST/42/2/B-1473
55 A Review of Clone Detection Techniques Using Model Semantics

Yachna Arora, Sarita Choudhary

Abstract

A model clone is a set of similar or identical fragments in a model of a system. Understanding and identifying model clones are important aspects of software evolution, during which cloning is often used as a strategic means. Clone detection techniques play an important role in software-evolution research, where attributes of the same code entity are observed over multiple versions. To successfully create any method or technique for model clone detection, we have to study all the models defined in UML, including the internal and external structure of UML. This paper reviews some of the techniques available for model clone prevention and detection.
Full Paper

IJCST/42/2/B-1474
56 A Novel Technique for Conflict Handling Strategies Using Attribute Value Reconciliation

Arumuga Arun.R, Anbazhagan.K

Abstract

A challenging task in the data integration process is resolving attribute-value conflicts, whose main cause is data heterogeneity. In this paper, a determination framework is developed to resolve numerical value conflicts. Instead of an ad-hoc approach, the framework uses a new, efficient, systematic approach: it considers the consequences of incorrect numerical values and selects the numerical value that minimizes the error cost for various applications.
Full Paper

IJCST/42/2/B-1475
57 Object Serialization Formats and Techniques a Review

Surbhi, Rama Chawla

Abstract

Serialization is the process of converting an object into a stream of data so that it can be easily transmitted over a network or persisted in a storage location, such as a physical file, a database, or a network stream. This paper summarizes some of the ongoing work in the field of object serialization and presents object serialization techniques useful for various purposes, including serialization minimization, which can be used to decrease the size of serialized data.
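As a minimal illustration of the object-to-stream round trip, the sketch below serializes the same record through a text format (JSON) and a binary Python-specific format (pickle) and compares the serialized sizes; the sample record is an illustrative assumption.

```python
import json
import pickle

record = {"id": 7, "name": "sensor-a", "readings": [1.5, 2.25, 3.0]}

as_json = json.dumps(record).encode()     # text-based, interoperable
as_pickle = pickle.dumps(record)          # binary, Python-specific

# both streams must deserialize back to an equal object
assert json.loads(as_json) == record
assert pickle.loads(as_pickle) == record

sizes = {"json": len(as_json), "pickle": len(as_pickle)}
```

Comparing such sizes across formats is the starting point for the serialization-minimization work the abstract mentions: the byte stream, not the in-memory object, is what crosses the network or lands in storage.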
Full Paper

IJCST/42/2/B-1476
58 A Review of NTLM Rainbow Table Generation Techniques

Meetika Malhotra, Bhushan Dua

Abstract

Rainbow tables reduce the difficulty of brute-force cracking a single password by creating a large pre-generated data set of hashes for nearly every possible password. This method, known as the faster time-memory trade-off technique, is based on research by Martin Hellman and Ronald Rivest in the early 1980s on the performance trade-offs between processing time and the memory needed for cryptanalysis. In this paper we review some of the most important work on rainbow table generation and on using rainbow tables in the Windows NT environment, i.e., against NTLM, and discuss why NTLM is weak against rainbow table attacks.
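The core of table generation is the hash chain: alternate a hash function with a reduction function that maps each digest back into the password space, then store only the chain's start and end points. The sketch below uses MD5 (for brevity; NTLM itself uses MD4 over UTF-16LE), a 4-digit-PIN password space, and a step-salted reduction, all of which are illustrative assumptions.

```python
import hashlib

def h(pw):
    # stand-in for the target hash (NTLM would be MD4 over UTF-16LE)
    return hashlib.md5(pw.encode()).hexdigest()

def reduce_digest(digest, step):
    # map a hex digest back into the 4-digit "password" space; mixing in
    # the step index gives each chain position a distinct reduction,
    # which is what distinguishes rainbow tables from plain Hellman chains
    return f"{(int(digest[:8], 16) + step) % 10000:04d}"

def chain(start, length=1000):
    pw = start
    for step in range(length):
        pw = reduce_digest(h(pw), step)
    return pw   # only (start, end) would be stored in the table

end = chain("1234")
```

At lookup time the attacker regenerates candidate chain tails from a captured hash until an end point matches, then replays that chain from its stored start to recover the password, trading recomputation time for the memory a full hash dictionary would need.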
Full Paper

IJCST/42/2/B-1477
59 Comparing the Performance of Grid and Cloud Computing and Enhancing it for Faster Retrieval of Data

Amanpreet Kaur Saini, Gunpreet Singh

Abstract

Cloud computing is an on-demand, pay-as-you-use model that offers a flexible and cost-effective means of accessing compute resources (services). It is recognized as a revolution in the computing era that developed from grid computing. The service provider uses virtualization technologies in its systems, and customers are charged based on the amount of resources (such as computers, infrastructure, data storage, and application services) used or reserved. Performance monitoring is an integral part of the computing environment. This paper makes a comparative study of the performance of grid and cloud computing and suggests newer methods to enhance performance for faster retrieval of data.
Full Paper

IJCST/42/2/B-1478
60 Gesture Recognition: A Survey of Gesture Recognition Techniques Using Neural Networks

Mahesh Sharma, Rama Chawla

Abstract

Understanding human motion can be posed as a pattern recognition problem. In order to convey visual messages to a receiver, a human expresses motion patterns. Loosely called gestures, these patterns are variable but distinct and have an associated meaning. Pattern recognition by a computer or machine can be implemented via various methods such as Hidden Markov Models, linear programming, and neural networks; each method has its own advantages and disadvantages, which are studied separately. This paper reviews why ANNs in particular are better suited for analyzing human motion patterns.
Full Paper

IJCST/42/2/B-1479
61 Diagnosing and Managing Applications Running on a Remote PC via Android Phone

Harsha Daryani, Kavita Dudhagi, Rohinee Deshmukh, Shraddha Inamdar, Archana. S. Kadam

Abstract

This paper presents the design and implementation of ABRC (Android-Based Remote PC Control), i.e., Intelligent Admin, software for managing and monitoring processes by remote LAN monitoring and control using Android. The project is based on the concept of administering remote machines from a single location: a Windows network administrator can control certain aspects of remote machines without having to install and trigger an application on each remote machine that communicates with the administrator machine. The system acts as an administrator for the LAN, enabling remote monitoring and control of the machines on it. The mobile user, i.e., the administrator, dials the phone number to which the server is attached, and the mobile device then connects to the server machine through the wireless network. Once connected to the server, the Android phone displays a menu containing the functions that can be performed on a client PC. A single function can be performed at a time, but multiple clients can be controlled simultaneously.
Full Paper

IJCST/42/2/B-1480
62 A Review of Lempel Ziv Compression Techniques

Shalini Kapoor, Ashish Chopra

Abstract

With increasing amounts of text data being stored, efficient information retrieval and storage in the compressed domain has become a major concern. We have studied various compression algorithms such as LZ77 and LZ78, and during this study noticed that dictionary-based compression algorithms have several drawbacks. This paper addresses several key issues with the dictionary-based LZW algorithm as it exists today. In contrast to previous LZW variants, we would like to improve the LZW algorithm so as to achieve a better compression ratio, reduce the time taken for dictionary searches during pattern matching in encoding and decoding, and make the dictionary size dynamic. In future work we intend to present possible modifications to the existing Lempel-Ziv-Welch algorithm by changing its data structure.
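For reference, the baseline LZW encoder being discussed can be stated in a few lines: the dictionary starts with all single bytes and grows as repeated patterns appear, and the per-symbol dictionary lookup is the step whose data structure the authors propose to change. The sample input is the classic repeated-phrase example, an illustrative choice.

```python
def lzw_encode(data):
    table = {bytes([i]): i for i in range(256)}   # initial single-byte dictionary
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                      # extend the current match
        else:
            out.append(table[w])        # emit code for the longest match
            table[wc] = len(table)      # add the new pattern to the dictionary
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

codes = lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT")   # 24 bytes in, fewer codes out
```

Codes at 256 and above mark dictionary hits on multi-byte patterns; how quickly `wc in table` resolves (hash table, trie, etc.) is exactly the data-structure question the abstract raises.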
Full Paper

IJCST/42/2/B-1481
63 The Impact of Mobility on the Performance of AODV and DSR Using NCTUns 6.0 Simulator

G. Vijaya Kumar

Abstract

The Mobile Ad Hoc Network (MANET) is a multihop, infrastructureless wireless network composed of mobile nodes. Routing is a challenging task in MANETs because of dynamic topology, the absence of centralized control, power-constrained nodes, etc. Ad-hoc On-demand Distance Vector (AODV) and Dynamic Source Routing (DSR) are two popular MANET routing protocols, and they have been analyzed using various simulators such as NS2, OPNET, QualNet, and OMNeT++. Little work has been done on analyzing AODV and DSR using the NCTUns 6.0 simulator, and one important performance parameter, mobility, has not been considered. This paper studies the impact of mobility on the performance of AODV and DSR using the NCTUns 6.0 simulator.
Full Paper

IJCST/42/2/B-1482
64 A Survey Paper on Linguistic Entropy Estimation Techniques

Shilpa Rani, Punita Meelu

Abstract

The entropy of a language is a statistical parameter that measures, in a certain sense, how much information is produced on average by each letter of a text in the language. The amount of information carried in the arrangement of words is the same across all languages, even languages that are not related to each other. This consistency could hint at a single common ancestral language, or at universal features of how human brains process speech. This paper surveys the methods and techniques available, and the work carried out so far, for estimating the entropy of languages.
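The simplest estimation technique in this family is the zeroth-order (unigram) estimate: count letter frequencies and apply Shannon's formula H = -Σ p(c) log2 p(c), in bits per letter. The pangram sample text below is an illustrative assumption; real estimates need large corpora and higher-order models to capture inter-letter dependencies.

```python
import math
from collections import Counter

text = "the quick brown fox jumps over the lazy dog".replace(" ", "")
counts = Counter(text)
n = len(text)

# H = -sum p(c) * log2 p(c), in bits per letter
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
```

Because a pangram uses all 26 letters almost uniformly, this estimate sits just below the log2(26) ≈ 4.70-bit ceiling; ordinary English text, with its skewed letter frequencies and context effects, comes out far lower under higher-order estimators.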
Full Paper

IJCST/42/2/B-1483
65 Review of Techniques Used for Detection of DOS Attacks

Sukhwinder Singh, Deep Mann

Abstract

In Mobile Ad-Hoc Networks (MANETs), Denial of Service (DoS) attacks not only consume limited system resources such as battery energy, CPU cycles, and bandwidth, but also isolate genuine users from the network. The functionality of the network is therefore affected, which may ultimately undermine networking operations such as control and data message delivery. Security is most important for secure and protected communication between mobile nodes in a hostile environment. Unlike wired networks, the unique characteristics of mobile ad hoc networks pose a number of nontrivial challenges to security design, such as an open peer-to-peer network architecture, a shared wireless medium, stringent resource constraints, and a highly dynamic network topology. Because of these challenges, there is a need to build multi-layer security solutions that achieve both broad protection and desirable network performance.
Full Paper

IJCST/42/2/B-1484
66 A Review of Artificial Immune System for Network IDS

Ritesh Rana, Punita Meelu

Abstract

This review paper explains the Artificial Immune System (AIS). An AIS is an optimization algorithm, belonging to the family of bio-inspired algorithms, that mimics parts of the behavior of the human immune system to protect computer networks from viruses and similar cyber attacks. Intrusion Detection Systems (IDS) aim at detecting attacks against computer systems and networks in general. The paper introduces the concepts behind AIS and its applications to intrusion detection in networks.
Full Paper

IJCST/42/2/B-1485
67 Computational Intelligence in the Hepatitis Diagnosis: A Review

Suchitra Kumari

Abstract

Automated diagnosis of diseases has long been of interest as an interdisciplinary study among computer science and medical researchers. Detecting hepatitis is a real problem for general practitioners. An expert doctor commonly makes decisions by evaluating a patient's current test results or by comparing the patient with other patients with the same condition, with reference to previous decisions. Various machine learning and data mining techniques have been widely exploited for the automatic diagnosis of hepatitis, though the classification accuracies of the different techniques vary, with the best reaching up to 96.77% in predicting the correct diagnosis. This article sketches out the wide range of options, recent developments, and potential of machine learning algorithms in the field of hepatitis diagnosis. A key advance has been the development of a more in-depth understanding and theoretical analysis of critical issues related to algorithm construction and learning theory. This should provide a good resource for researchers from all backgrounds interested in computational-intelligence-based hepatitis diagnosis methods, and allow them to extend their knowledge into this kind of research for better diagnosis of hepatitis and perhaps other diseases.
Full Paper

IJCST/42/2/B-1486
68 CAPTCHA: Attacks and Weaknesses Against OCR Technology

Silky Azad, Kiran Jain

Abstract

The basic challenge in designing obfuscating CAPTCHAs is to make them easy enough that users are not dissuaded from attempting a solution, yet still too difficult to solve using available computer vision algorithms. As technology advances, this gap becomes thinner and thinner. It is possible to enhance the security of an existing text CAPTCHA by systematically adding noise and distortion and arranging characters more tightly. These measures, however, also make the characters harder for humans to recognize, resulting in higher error rates and higher network load. This paper presents a few of the most active attacks on text CAPTCHAs existing today.
Full Paper

IJCST/42/2/B-1487
69 Comparative Study of Various Image Restoration Techniques on the Basis of Image Quality Assessment Parameters

Raman Kumar, Anil Gupta

Abstract

One of the big challenges in digital photography is motion blur. To remove blur, we need (i) to estimate how the image is blurred (i.e., the blur kernel, or point-spread function) and (ii) to restore a natural-looking image through deconvolution. Blur kernel estimation is challenging because the algorithm needs to distinguish the correct image and kernel pair from incorrect ones that can also adequately explain the blurred image. The process of deconvolution is also difficult because the algorithm needs to restore high-frequency image content attenuated by blur. In this paper, we address a few aspects of these challenges. We introduce the insight that a blur kernel can be estimated by analyzing edges in a blurred photograph, and recover the blur using the inverse Radon transform. This method is computationally attractive and is well suited to images with many edges. Blurred edge profiles can also serve as additional cues for existing kernel estimation algorithms; we introduce a method to integrate this information into a maximum-a-posteriori kernel estimation framework and show its benefits. We compare restored Gaussian-blurred images across four deblurring techniques, namely the Wiener filter, the inverse filter, the Lucy-Richardson deconvolution algorithm, and our proposed algorithm, on the basis of well-known image quality assessment parameters such as mean squared error (MSE) and peak signal-to-noise ratio (PSNR).
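The two assessment parameters used in the comparison are straightforward to compute between a reference image and a restored one; the tiny 2x3 "images" below are illustrative assumptions, and the 255 peak assumes 8-bit pixels.

```python
import math

ref      = [[100, 110, 120], [130, 140, 150]]
restored = [[ 98, 111, 119], [131, 138, 151]]

def mse(a, b):
    # mean of squared per-pixel differences
    diffs = [(x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb)]
    return sum(diffs) / len(diffs)

def psnr(a, b, peak=255):
    # higher PSNR means the restored image is closer to the reference
    return 10 * math.log10(peak ** 2 / mse(a, b))

error = mse(ref, restored)       # 2.0 for this pair
quality = psnr(ref, restored)    # roughly 45 dB
```

MSE and PSNR carry the same information (PSNR is a logarithmic rescaling of MSE against the peak value), which is why deblurring comparisons typically report both on the same table.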
Full Paper

IJCST/42/2/B-1488
70 Hepatitis Disease Diagnosis Using Mixture of Expert

Suchitra Kumari

Abstract

Automated diagnosis of diseases has long been of interest as an interdisciplinary study among computer science and medical researchers. Detecting hepatitis is a real problem for general practitioners. An expert doctor commonly makes decisions by evaluating a patient's current test results or by comparing the patient with other patients with the same condition, with reference to previous decisions. In this study, various models were generated using a mixture of experts as the classification method. The model with the best accuracy, 97.37%, and the lowest mean squared error was selected for the prediction of the disease. This approach can be used for easy diagnosis of hepatitis in large populations by incorporating the profiles of more samples in the training stage.
Full Paper

IJCST/42/2/B-1489
71 Cloud Computing Revolution in New Web Technology: Application and Challenges Survey

Lenka Venkata Satyanarayana

Abstract

Cloud computing may be the most innovative technology development in decades. Users want to access services based on need, without regard to where a service is hosted or how it is delivered, and experts have suggested various computing systems for these needs, such as cluster computing, grid computing, and cloud computing. Since 2007, acceptance of cloud computing among users has exceeded that of other technologies. It is a technology in which software applications, processing power, data, and potentially even artificial intelligence are accessed over the Internet; in simple words, any situation in which computing (processing) is done in a remote location (out in the clouds) rather than on your desktop or portable device.
Full Paper

IJCST/42/2/B-1490
72 The obligatory of an Algorithm for Matching and Predicting Crime – Using Data Mining Techniques

Anshu Sharma, Raman Kumar

Abstract

This survey paper categorizes, compares, and summarizes the data sets, algorithms, and performance measurements in almost all published technical and review articles on automated crime pattern detection. Crime is classically "unpredictable": it is not necessarily random, but neither does it take place consistently in space and time. A better theoretical understanding is needed to facilitate practical crime-prevention solutions tied to specific places and times. Crime analysis uses past crime data to predict future crime locations and times. The retrieved literature uses mining algorithms including statistical tests, regression analysis, neural networks, decision trees, and Bayesian networks. For crime pattern detection, the commonly used data mining techniques are clustering and classification; generally, the detection accuracy of neural networks is superior to that of other classification models. A general conclusion is that improvements in clustering can improve classifier evaluation, so further algorithms for improving clustering techniques are needed. Owing to the size of the data samples, some of the literature reached conclusions based only on training samples.
Full Paper

IJCST/42/2/B-1491
73 Literature Survey on Color Image Compression

Ranjodh Kaur, Harbhag Singh, Jagdeep Singh, Sandeep Kaur

Abstract

The need for efficient image compression techniques is ever increasing, because raw images require large amounts of disk space, a significant disadvantage for both transmission and storage. Many compression techniques exist, and one that is fast, memory efficient, and simple will best suit the requirements of the user. This paper reviews some of the existing color image compression techniques.
Full Paper

IJCST/42/2/B-1492
74 Prevention of Wormhole Attack using an Unobservable Secure Routing Scheme USOR

A. Arun Kishore, P. Vinothiyalakshmi

Abstract

Mobile Ad hoc Networks are infrastructure-less networks, and networks of this kind are vulnerable to many types of attack. A number of schemes have been proposed to provide security and protect against such attacks. The wormhole attack is particularly dangerous because it disrupts routing at several levels across different routing protocols. Our approach to wormhole detection enables the receiver to detect wormhole nodes using unobservable routing. We propose an efficient method to identify wormhole nodes in the routing path and rediscover new routes from the source node to the target node, by applying an improved hop-count-based detection in which each node checks its one-hop neighbors against its neighbor table. Once wormhole nodes are detected, their entries are removed from the neighbor table. We also evaluate the scheme through simulation in ns-2.
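The hop-count check described in the abstract can be sketched as follows. This is a hypothetical illustration rather than the authors' implementation; the table layout and the `margin` threshold are our assumptions. A node flags a destination as suspicious when an advertised route undercuts the best hop count achievable through its verified one-hop neighbors:

```python
def detect_wormhole(neighbor_table, advertised_routes, margin=2):
    """neighbor_table: {neighbor: {dest: hops}} for verified one-hop
    neighbors; advertised_routes: {dest: hops} heard in route replies.
    Returns destinations whose advertised route is implausibly short."""
    suspicious = []
    for dest, advertised in advertised_routes.items():
        # Best legitimate hop count: one hop to a neighbor, then its route.
        known = [1 + hops[dest]
                 for hops in neighbor_table.values() if dest in hops]
        if known and advertised < min(known) - margin:
            suspicious.append(dest)  # likely tunneled through a wormhole
    return suspicious
```

A flagged destination would then have its entries purged from the neighbor table before route rediscovery, as the abstract describes.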
Full Paper

IJCST/42/2/B-1493
75 Implementing UPGMA and NJ Method For Phylogenetic Tree Construction Using Hierarchical Clustering

Sukhpreet Kaur, Harwinder Singh Sohal, Rajbir Singh Cheema

Abstract

Research in bioinformatics has accumulated a large amount of data, and as hardware technology advances, the cost of storage is decreasing. Biological data are available in different formats and are comparatively complex; knowledge discovery from these large and complex databases is a key problem of this era. Data mining and machine learning techniques are needed that can scale to the size of these problems and can be customized to biological applications. Constructing a phylogenetic tree is a very challenging problem. The main purposes of a phylogenetic tree are to determine the structure of an unknown sequence and to predict the genetic difference between species. There are different methods for phylogenetic tree construction from character or distance data, and different ways to compute distance, including the comparative distance between two sequences, distance using UPGMA, and Neighbour Joining. Computing distance from the available sequences is itself an intricate problem, and each method has its own merits and demerits. In the present work, distance is computed using the comparative method (scoring by differences) and using UPGMA, applied to distance data for a human phylogenetic problem. UPGMA and Neighbour Joining are used to construct the trees; the final trees give anthropological information about human beings, and the results are also shown in hierarchical clustering form.
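As a concrete illustration of the UPGMA step, the following sketch (our own, not the paper's code) repeatedly merges the closest pair of clusters in a symmetric distance matrix, size-weighting the averaged distances, until a single nested-tuple tree remains:

```python
def upgma(dist):
    """dist: {label: {other_label: distance}} (full symmetric matrix).
    Returns the tree as nested tuples of the original labels."""
    dist = {a: dict(row) for a, row in dist.items()}  # defensive copy
    size = {a: 1 for a in dist}
    while len(dist) > 1:
        keys = list(dist)
        # Closest pair of clusters under the current matrix.
        a, b = min(((keys[i], keys[j]) for i in range(len(keys))
                    for j in range(i + 1, len(keys))),
                   key=lambda p: dist[p[0]][p[1]])
        merged = (a, b)
        # Size-weighted average distance to every remaining cluster.
        new_row = {c: (dist[a][c] * size[a] + dist[b][c] * size[b])
                      / (size[a] + size[b])
                   for c in dist if c not in (a, b)}
        del dist[a], dist[b]
        for c, d in new_row.items():
            del dist[c][a], dist[c][b]
            dist[c][merged] = d
        dist[merged] = new_row
        size[merged] = size[a] + size[b]
    return next(iter(dist))
```

On a three-taxon matrix where A and B are closest, the sketch first merges (A, B), then joins C, mirroring the hierarchical clustering view mentioned in the abstract.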
Full Paper

IJCST/42/2/B-1494
76 A Review of Data Compression Techniques and Data Compression Symmetry

Harpreet Kaur

Abstract

The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing the effective data density. Data compression is useful in many fields, particularly in communications and in file storage and distributed systems, because it enables devices to transmit or store the same amount of data in fewer bits, reducing resource usage such as data storage space or transmission capacity. This paper presents two types of data compression method, using the techniques of run-length encoding, Burrows-Wheeler, scalar quantization, and vector quantization, and also discusses data compression symmetry. Compression can be either lossy or lossless: lossless compression reduces bits by identifying and eliminating statistical redundancy, while lossy compression reduces bits by identifying unnecessary information and removing it.
Full Paper

IJCST/42/2/B-1495
77 Segregation of Various Crossover Operators in TSP using G.A

Jyoti Naveen, Rama Chawla

Abstract

The Travelling Salesman Problem is considered an NP-hard problem. A Genetic Algorithm (GA) is an approximate algorithm that does not always aim to find the shortest tour, but rather a reasonably short tour quickly. Crossover operators play an important role in a GA. In this paper various crossover operators are compared, and the results show that PMX performs best among them.
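PMX (partially mapped crossover), which the abstract reports as best, can be sketched as follows; the cut points and tours used below are the standard textbook example, not data from the paper:

```python
def pmx(p1, p2, i, j):
    """Partially mapped crossover: the child inherits p1[i:j] directly,
    and every other city takes a position consistent with parent p2."""
    n = len(p1)
    child = [None] * n
    child[i:j] = p1[i:j]
    for k in range(i, j):
        gene = p2[k]
        if gene in child[i:j]:
            continue  # already placed via the copied segment
        pos = k
        # Follow the segment mapping until a free position is found.
        while i <= pos < j:
            pos = p2.index(p1[pos])
        child[pos] = gene
    # Remaining positions come straight from p2.
    for k in range(n):
        if child[k] is None:
            child[k] = p2[k]
    return child
```

The mapping step guarantees the child is a valid tour: every city appears exactly once.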
Full Paper

IJCST/42/2/B-1496
78 Result Analysis of BPNN Based Image Compression and Wavelet Transform Based Image Compression

Surbhi, Sandeep Jain

Abstract

Compression algorithms are methods that reduce the number of symbols used to represent source information, thereby reducing the space needed to store it or the time needed to transmit it over a channel of given capacity. This paper presents a neural-network-based technique and a wavelet-based technique for image compression. A three-layered Back Propagation Neural Network (BPNN) was designed to build the image compression system, and the back propagation (BP) algorithm was used to train it. We also discuss wavelet-based compression: the wavelet transform is a key ingredient in most state-of-the-art image compression algorithms, since it provides a multiscale analysis that is localized in both space and frequency; as a result, 1-D wavelets provide efficient representations for the large and useful class of piecewise-smooth 1-D signals.
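The wavelet idea can be illustrated with a single level of the 1-D Haar transform (a sketch of the general principle, not the paper's particular transform): averages capture the coarse signal, differences capture detail, and zeroing small differences is where lossy compression comes from.

```python
def haar_step(signal):
    """One level of the 1-D Haar transform (signal length must be even)."""
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    diffs = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avgs, diffs

def inverse_haar_step(avgs, diffs):
    """Reconstruct the signal exactly from averages and details."""
    out = []
    for a, d in zip(avgs, diffs):
        out.extend([a + d, a - d])
    return out
```

In a 2-D image codec this step is applied along rows and then columns, recursively on the averages, yielding the multiscale decomposition the abstract refers to.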
Full Paper

IJCST/42/2/B-1497
79 Comparison Study of Horizontal, Vertical and Combined Noise Image Deblurring Algorithm

Sunny, Kiran Jain

Abstract

An image is a representation of our visual perception, and images are an integral part of our technology. A major problem arises when an image is blurred or contains noise: blurriness is difficult to avoid and can sometimes ruin the entire image. In this paper we discuss techniques for deblurring an image and propose a PDE-based image deblurring model. On the basis of this PDE model, we compare three deblurring algorithms: horizontal, vertical, and combined.
Full Paper

IJCST/42/2/B-1498
80 A Back Propagation Neural Network Based Approach for Recognizing Characters and Digits

Geeta, Sandeep Jain

Abstract

Neural networks have wide application in the field of pattern recognition. In this paper we propose an approach to recognizing English characters and digits using a multilayer perceptron with one hidden layer. The features extracted from the handwritten characters are Fourier descriptors, and the network is trained with the back propagation algorithm on a variety of samples. After training, the network is tested; the results show improved recognition accuracy and reduced training time.
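Fourier descriptors of a character boundary can be computed as below. This is a standard-library sketch under our own conventions, since the abstract does not specify a normalization: boundary points are treated as complex numbers, the DFT is taken, the DC term is dropped for translation invariance, and magnitudes are divided by |F(1)| for scale invariance.

```python
import cmath

def fourier_descriptors(boundary, k=8):
    """boundary: list of (x, y) contour points; returns k
    magnitude-normalized Fourier descriptors."""
    z = [complex(x, y) for x, y in boundary]
    n = len(z)
    coeffs = [
        sum(z[t] * cmath.exp(-2j * cmath.pi * u * t / n) for t in range(n))
        for u in range(k + 1)
    ]
    base = abs(coeffs[1]) or 1.0        # scale normalization
    return [abs(c) / base for c in coeffs[1:]]  # skip F(0): translation term
```

Because the descriptors are invariant to translation and scale, a scaled copy of the same contour yields the same feature vector, which is what makes them suitable inputs for the perceptron.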
Full Paper

IJCST/42/2/B-1499
81 Enhancing the Performance of Component Based Software Quality Model

Nidhi, Sandeep Jain

Abstract

Performance measurement of component-based systems helps software architects evaluate system performance based on the behaviour of their component specifications. Many researchers have proposed approaches in this direction, but none has attained widespread industrial use. One of the main objectives of developing component-based software systems is to focus on the reuse of already existing software components. A large number of quality models have been developed for understanding, measuring, and predicting the quality of software and information systems, and quality models should be constructed according to the specific features of the application domain. In this paper we conduct a comprehensive state-of-the-art survey of more than 20 of these approaches, assessing their applicability; we present the challenges to developing standard, complete, and pervasive software quality models, and discuss solutions to these challenges and their importance.
Full Paper

IJCST/42/2/B-1500
82 Solving Travelling Salesman Problem Using Artificial Bee Colony Based Approach

Sahil Sobti, Parikshit Singla

Abstract

This paper examines the performance of Artificial Bee Colony (ABC) algorithms in solving the Travelling Salesman Problem (TSP). The goal of the TSP is for a salesman to visit a number of cities and return to the starting city along the shortest possible route. Because domain experts are difficult to find, and knowledge extraction from experts is itself a difficult task, data-driven modelling assumes significance: one has to apply soft-computing-based methodology to generate a rule base from data. Neural networks, genetic algorithms, and particle swarm optimization are some of these approaches [1]. The basic Artificial Bee Colony (ABC) algorithm has the advantages of strong robustness, fast convergence, high flexibility, and few parameters to set, but it suffers from premature convergence in the later search period, and the accuracy of the optimal value sometimes cannot meet requirements [4].
Full Paper

IJCST/42/2/B-1501
83 Base Station Controlled and Energy Efficient Centralized Hierarchical Routing Protocol in Wireless Sensor Network

Sumit Wadhwa, Sarita Choudhary

Abstract

Recent technological advances in communication and computation have enabled the development of low-cost, low-power, small, multifunctional sensor nodes for wireless sensor networks. Energy saving is the crucial issue in designing a wireless sensor network: the main constraint on its implementation is the energy of the nodes, i.e., their battery life. In this paper we design a centralized, energy-saving hierarchical routing protocol, named the Base Station Controlled and Energy Efficient Centralized Hierarchical Routing Protocol (BECH), and compare it with BCDCP (Base Station Controlled Dynamic Clustering Protocol) and SHPER (Scaling Hierarchical Power Efficient Routing). In BECH, the base station initially requests all nodes to send their neighbour lists and residual energy. Having this information about the whole network, the base station computes clusters in such a way that energy consumption is reduced. In BECH, the selection of cluster heads is not randomized but is based on the residual energy of the cluster nodes and the logical structure of the whole network, so the lifespan of the whole network is increased.
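The centralized step of BECH can be sketched as follows. This is our own illustration; node positions, energy values, and the nearest-head assignment rule are assumptions, since the abstract does not give the exact cost computation:

```python
import math

def form_clusters(positions, energy, k):
    """positions: {node: (x, y)}; energy: {node: residual energy}.
    The base station picks the k highest-energy nodes as cluster heads
    (deterministic, not randomized), then assigns every other node to
    its nearest head to limit radio cost."""
    heads = sorted(energy, key=energy.get, reverse=True)[:k]
    clusters = {h: [] for h in heads}
    for node, pos in positions.items():
        if node in heads:
            continue
        nearest = min(heads, key=lambda h: math.dist(pos, positions[h]))
        clusters[nearest].append(node)
    return clusters
```

Because head selection is by residual energy rather than random draw, depleted nodes are never asked to serve as heads, which is the mechanism the abstract credits for the longer network lifespan.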
Full Paper

IJCST/42/2/B-1502
84 Flexible Congestion Control Routing Protocol

Vandana Chopra, Ashish Chopra

Abstract

In Mobile Ad hoc Networks, congestion occurs under heavy network load and causes packet loss. In this paper we use adaptive routing with a congestion control technique for mobile ad hoc networks, based on a predictive congestion index for each node. As the number of connections increases, the capacity of nodes to connect with more requesting nodes in a given region also increases; when the requirements of all nodes are fulfilled, there is no delay or blockage, and hence no congestion occurs. We implement this technique in the AODV routing protocol with an increased gate size at each node to avoid congestion in ad hoc networks. Simulation results show that our proposed technique attains a high delivery ratio and high performance, with low control overhead and reduced delay across relative speeds, compared with the existing AODV protocol. The performance of the proposed protocol is evaluated in the OMNeT++ simulator.
Full Paper

IJCST/42/2/B-1503
85 A Review of Detection of DDOS Attack Using Entropy Based Approach

Surender Singh, Sandeep Jain

Abstract

Websites act as prime targets for attacks such as DDoS attacks, worm propagation, and many other application-layer attacks. Detecting an application-layer DDoS attack is a cumbersome task: DDoS attacks originally operated at the lower layers, i.e., the network and transport layers, whereas the new application-layer DDoS attacks use genuine HTTP requests to tie up the victim's resources, which makes them hard to detect. Various tools such as Hyenae, Strut, LOIC, and HOIC have been used to study DDoS attack scenarios against websites. A distributed framework helps improve the quality of response for genuine traffic under DDoS attack, and a distributed solution is required to match the distributed nature of the attack. We therefore propose a defense architecture that can efficiently detect these attacks by analyzing the behavior of packet flows. An entropy-based detection and traceback algorithm is used to distinguish malicious flows from legitimate ones efficiently. This work includes simulation and performance analysis of the proposed framework.
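The entropy-based detection can be illustrated as below (a generic sketch; the choice of header field, window size, and threshold are our assumptions): compute the Shannon entropy of a header field, such as the source IP, over a traffic window, and flag windows whose entropy deviates from a learned baseline.

```python
import math
from collections import Counter

def window_entropy(values):
    """Shannon entropy (bits) of a window of header-field values."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def is_attack(window, baseline, tolerance=1.0):
    """Flag the window when its entropy deviates from the learned
    baseline by more than `tolerance` bits (illustrative threshold)."""
    return abs(window_entropy(window) - baseline) > tolerance
```

A flood from a single source collapses the entropy toward zero, while widely spoofed sources push it above the baseline; either deviation triggers the flag, after which a traceback step would localize the offending flows.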
Full Paper

IJCST/42/2/B-1504
86 A Review of IE Protocol to Increase the Uptime of the Wireless Sensor Networks

Bhim Singh, Bhushan Dua

Abstract

The primary goals of WSNs are the use of limited battery-powered sensor nodes, minimum energy consumption, maximum network connectivity, and maximum network uptime. Clustering is an effective and energy-efficient way to increase the uptime of wireless sensor networks: the whole network is grouped into a number of clusters, and each cluster has a cluster head responsible for collecting information from all non-cluster-head nodes and sending it to the base station. Clustering protocols face challenging issues, such as selecting an optimal group of sensor nodes as cluster heads (CHs) and tolerating node failure to maintain network connectivity. We propose an Interim Election protocol that selects a new cluster head at the time of node failure, to mitigate the network partitioning problem in WSNs. In the proposed protocol, if a cluster head dies, whether from lack of energy or natural death, a re-election of the cluster head is performed at the time of failure to increase the connectivity and uptime of the network.
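The interim election step can be sketched as follows (an illustrative reading of the protocol; the energy model and tie handling are our assumptions): when the current head fails, the surviving cluster member with the most residual energy becomes the new head, keeping the cluster connected to the base station.

```python
def interim_election(members, energy, failed_head):
    """members: node ids in the cluster; energy: {node: residual energy}.
    Elect the surviving member with the most residual energy as the
    interim cluster head after `failed_head` dies."""
    alive = [m for m in members if m != failed_head]
    return max(alive, key=energy.get)
```

Re-electing at failure time, rather than waiting for the next scheduled round, is what prevents the temporary network partition the abstract describes.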
Full Paper

IJCST/42/2/B-1505