


 

INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST)-VOL 3 ISSUE 1-VER. 1, JAN TO MARCH 2012


S.No. Research Topic Paper ID
   1 Bandwidth Extension in LTE-Advanced using Carrier Aggregation

Satnam Singh, Amit Kumar, Dr. Yunfei Liu, Dr.Sawtantar Singh Khurmi

IJCST/31/1/
A-393
   2 Literature Survey on Performance of Grid-Based Distributed Parallel Computing
M. Srinivas, V. Kondala, Md. Mohammad Shareef

Abstract

Grid computing provides new techniques for solving numerous complex problems, and implementing distributed parallel computing of large-scale problems on the grid is an inevitable trend. This paper presents two implementations of distributed parallel computing on Globus Toolkit, a widely used grid environment. The first, Loosely Coupled Parallel Services, achieves large-scale parallel computation that can be broken down into independent sub-jobs, using a corresponding implementation framework; the second, Grid MPI Parallel Program, handles specialized applications that cannot easily be split into independent chunks, using the proposed implementation framework. We make a beneficial attempt to implement distributed parallel computing on grid computing environments.
Full Paper

IJCST/31/1/
A-394
   3 A Novel Approach for Page Rank in Incremental Crawler
Sakshi Goel, Anjana, Akhil Kaushik, Kirtika Goel

Abstract

In this paper we present an enhanced architecture for an incremental web crawler with efficient updating of pages. The incremental crawler selectively updates its database and/or local collection of web pages, instead of periodically refreshing the collection in batch mode, thereby significantly improving the freshness of the collection and bringing in new pages in a timelier manner. We propose an architecture for the incremental crawler that takes an efficient approach to a different measure for search engines, namely the quality of the pages in the search engine's index. We provide a simple, effective algorithm for estimating the quality of an index, used to determine PageRank in the ranking module for page replacement in the collection.
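As a rough illustration of the ranking step the abstract mentions, the sketch below runs power-iteration PageRank on a made-up three-page link graph; it is the generic textbook formulation, not the authors' estimation algorithm.

```python
# Minimal power-iteration PageRank (illustrative only).
# The toy link graph below is an assumption for the example.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Here page "c" ends up ranked above "b" because it receives links from both "a" and "b", and the ranks always sum to 1.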
Full Paper

IJCST/31/1/
A-395
   4 Searching for True Information from Multiple Conflicting Information Providers on the Web
S. Sree Hari Raju, C. Balarengadurai

Abstract

In this project, we present WebIBC, which integrates public key cryptography into web applications without any browser plugin. The public key in WebIBC is provided by identity-based cryptography, eliminating the need for online retrieval of public keys and certificates; the private key is supplied by the fragment identifier of the URL.
Full Paper

IJCST/31/1/
A-396
   5 Virtual-Class: An adaptive Gadget Integration Platform
Gopinath Chokkarapu, M.A Jabbar, Srikanth Jatla

Abstract

In this paper we propose a virtual-class environment that applies off-the-shelf (COTS) gadget-integration methodologies in a traditional classroom. The proposed model removes the boundaries between tele-education and physical classroom activities in terms of the teacher's experience, seamlessly integrating these two currently separate educational practices. More specifically, we replace the legacy desktop-based tele-education system, so that teachers are not forced to trade the marker and whiteboard for a mouse and keyboard. In the proposed model, teachers can use multiple traditional approaches while interacting with students attending the virtual class.
Full Paper

IJCST/31/1/
A-397
   6 Functioning Analysis of Load Balancing Algorithms in a Distributed Computing Environment
Pankaj Saxena, Gaurav Saxena, Saurabh Kumar, Amit Kumar, Dr. Rajendra Belwal

Abstract

Load balancing is the process of improving the performance of a parallel and distributed system by distributing load among the processors. In this paper we present a performance analysis of various load balancing algorithms based on different parameters, considering the two typical approaches: static and dynamic. The analysis suggests that both static and dynamic algorithms have advantages as well as weaknesses relative to each other, and that the choice of algorithm depends on the type of parallel application to be solved. The main intention of this paper is to aid the design of new algorithms by considering the behavior of various existing ones.
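The static/dynamic contrast discussed above can be made concrete with a toy comparison: round-robin assignment (static, load-agnostic) versus least-loaded assignment (dynamic). The task costs and worker count below are invented for illustration.

```python
# Static policy: round-robin ignores how busy each worker actually is.
def round_robin(tasks, n_workers):
    loads = [0.0] * n_workers
    for i, cost in enumerate(tasks):
        loads[i % n_workers] += cost
    return loads

# Dynamic policy: always give the next task to the currently lightest worker.
def least_loaded(tasks, n_workers):
    loads = [0.0] * n_workers
    for cost in tasks:
        loads[loads.index(min(loads))] += cost
    return loads

tasks = [9, 1, 1, 1, 9, 1]          # made-up task costs
static = round_robin(tasks, 2)      # heavy tasks can pile onto one worker
dynamic = least_loaded(tasks, 2)    # reacts to runtime load
```

On this input the dynamic policy yields a smaller makespan (maximum per-worker load) than the static one, at the price of tracking load at runtime, which mirrors the trade-off the paper analyzes.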
Full Paper

IJCST/31/1/
A-398
   7 Cloud Computing-SaaS
Gurudatt Kulkarni, Jayant Gambhir, Rajnikant Palwe

Abstract

"Cloud computing," to put it simply, means "Internet computing": the Internet is commonly visualized as a cloud, hence the term for computation done through the Internet. With cloud computing, users can access database resources via the Internet from anywhere, for as long as they need, without worrying about maintenance or management of the actual resources. Besides, databases in the cloud are very dynamic and scalable; in fact, the cloud is a very independent computing platform. The best example of cloud computing is Google Apps, where any application can be accessed through a browser and deployed on thousands of computers over the Internet.
Full Paper

IJCST/31/1/
A-399
   8 IT and Rural Development: A Study on Government Contribution
Charu Gupta, Vinit Garg, Piyoush Gupta, M.K. Shukla

Abstract

Information technology has transformed the lives of the rural masses in recent times. With the advent of new technological solutions, services provided to rural people will result in the overall betterment of society: on one side by enriching people with up-to-date market information and the latest news of technological developments, and on the other by creating more market opportunities for them and adjusting market prices. Rural development greatly affects the economy of the country. Developing rural areas requires proper development of IT communication and infrastructure services, along with utilization of fiber-optic networks, and IT services need to be developed with reference to the present rural infrastructure. Web-based services combined with customer support should be provided in rural areas, which can increase the rate at which rural people accept these services. The paper attempts to identify the catalytic role of government in enhancing the rural sector by enabling it with IT-equipped services.
Full Paper

IJCST/31/1/
A-400
   9 E-Commerce Implementation, Problems, Solutions and Popularity in Managing Supply Chain: A Comparative Analysis of Different Top 10 Indian E-Commerce Companies
Kavita, Dr. U.S.Pandey

Abstract

This paper discusses the various factors of e-commerce that affect supply chain management, and explains the implementation, problems, solutions, and popularity of the top e-commerce companies. It presents a comparative analysis of these companies through colored graphs, from which the supply chain in online e-commerce can be readily analyzed.
Full Paper

IJCST/31/1/
A-401
   10 Outlier Detection Techniques and Cleaning of Data for Wireless Sensor Networks: A Survey
Vipnesh Jha, Om Veer Singh Yadav

Abstract

Pattern recognition is the scientific discipline whose goal is the classification of objects into a number of categories or classes, and it is an integral part of most sensing networks built for outlier detection. In wireless sensor networks, significant deviations from the pattern of sensed data are considered outliers; these include noise, errors, and malicious attacks on the network, all of which affect network performance. Sensor data is mostly multivariate, though it may also be univariate, and because of this the traditional techniques are not directly applicable to wireless sensor networks. This contribution surveys existing outlier detection techniques developed for wireless sensor networks. It also presents a framework to be used as a guideline for selecting an outlier detection technique suitable for an application, based on characteristics such as data type, outlier type, and outlier degree.
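As a minimal example of what "detecting significant deviations from the pattern of sensed data" can mean in the univariate case, the sketch below flags readings by z-score. This is a generic baseline, not one of the surveyed techniques, and the readings are made up.

```python
import statistics

# Flag readings whose z-score (deviation from the mean in units of the
# population standard deviation) exceeds a threshold. A simple baseline
# for univariate sensor data; real WSN techniques are far more elaborate.
def zscore_outliers(readings, threshold=2.0):
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    return [x for x in readings if abs(x - mean) / stdev > threshold]

readings = [20.1, 20.3, 19.9, 20.0, 20.2, 45.0]  # one faulty sensor value
outliers = zscore_outliers(readings)
```

The spike of 45.0 is flagged while the clustered readings around 20 are kept. Note that such a global test assumes roughly stationary data; sliding-window variants are common on sensor streams.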
Full Paper

IJCST/31/1/
A-402
   11 Method and Technique of Digitised Land Record Verification using Android Application
Ravi Singh Rana, Yugal Kumar, Dr. Dharmender Kumar

Abstract

Mobile communication is rapidly changing people's lifestyles, and we have seen vast deployments of mobile/wireless technologies such as GSM and UMTS in the last decade. About 70% of Indians have a mobile phone, which serves not only as a medium to remain connected but also as a powerful mechanism for empowerment, especially for the rural population. Government and industry bodies are trying to encourage the use of mobile phones as a tool for delivering various services such as information, banking, and government schemes. Digitisation of land records is taking place across the country, with many central and state agencies involved in the process; this sometimes creates discrepancies in land records and causes problems for land owners. This application provides the land owner with tools that can help resolve some of the discrepancies related to land boundaries.
Full Paper

IJCST/31/1/
A-403
   12 Explore on Parallel Computing Model of BSP in the NOWs Adapt Environment
M. Srinivas, R. Ravi Kumar, G. Yedukondalu

Abstract

This paper analyzes the characteristics of the BSP parallel computing model and of NOWs (networks of workstations), and explores the BSP model in a NOW-adapted environment. It indicates that rationally designed parallel algorithms under this model can achieve approximately linear speedup. A parallel algorithm for linear programming based on the improved simplex method, running on NOWs, obtained the best results, validating this conclusion.
Full Paper

IJCST/31/1/
A-404
   13 Energy Concerns in Agro Based Monitoring using Wireless Sensor Networks
R. Selva Rama Devi, V.R. Sarma Dhulipala

Abstract

Energy is a mandatory limiting factor for a wireless sensor network (WSN) during the design process, and it is a central concern in networking, control, and monitoring applications such as habitat monitoring, health care, home automation, and animal behavior monitoring. Here, animals' neck movement is monitored to generate sufficient energy; the movement model is based on existing literature and the respective stochastic models. Applying Faraday's law, the voltage produced by the animal's neck movement is converted to power, which serves as energy scavenging for a wireless sensor node. This study proposes that the energy generated by the vertical neck-head movement of sheep during grazing can be converted to useful electrical power adequate for the operation of a WSN.
Full Paper

IJCST/31/1/
A-405
   14 Design of UART (Universal Asynchronous Receiver Transmitter) using VHDL
Ananya Chakraborty, Surbhi, Sukanya Gupta, Swati Deshkar, Pradeep Kumar Jaisal

Abstract

A UART (Universal Asynchronous Receiver Transmitter) is used for serial data transmission: it transmits or receives data serially with the help of a shift register. Its frame format consists of one start bit (usually low), 5-8 data bits, one optional parity bit, and one stop bit (of polarity opposite to the start bit). "Asynchronous" means that data is transmitted using start and stop bits, so there is no need to send a synchronization pad such as ASCII SYN to synchronize transmitter and receiver. Data is transmitted at rates of 9600 to 38400 bps. The whole process of serial transmission is based on the principle of the shift register.
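The frame format described above can be sketched directly. The helper below (an illustration of the framing, not the paper's VHDL design) assembles the bit sequence for one 8-data-bit frame with even parity.

```python
# Build one UART frame: start bit (0), 8 data bits LSB-first,
# a parity bit, and a stop bit (1). Illustrative only; a real UART
# shifts these bits out at the configured baud rate.
def uart_frame(byte, parity="even"):
    data = [(byte >> i) & 1 for i in range(8)]  # LSB transmitted first
    parity_bit = sum(data) % 2                  # even parity: total 1s even
    if parity == "odd":
        parity_bit ^= 1
    return [0] + data + [parity_bit] + [1]

frame = uart_frame(0x55)  # 0b01010101 -> alternating data bits
```

For 0x55 the frame is 11 bits long: start bit, the eight data bits `1,0,1,0,1,0,1,0`, an even-parity bit of 0 (four ones in the data), and the stop bit.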
Full Paper

IJCST/31/1/
A-406
   15 Component Based Software Engineering using Innovative Patterns
Pranayanath Reddy Anantula, Raghuram Chamarthi (Tejoraghuram)

Abstract

In today's world of rapidly advancing technology, quality is the prime concern for any kind of software product. A software product is said to be a quality product if it satisfies all the customer requirements. Software quality can be categorized broadly in two ways: quality of design and quality of conformance. Quality of design encompasses the requirements specification and design of the system, while quality of conformance focuses on implementation. The quality of software is intricately connected to the underlying architecture, because architecture is the base for further development of the project (analysis, design, and implementation). The work products of architecture refinement are the high-level design, low-level design, implementation, and test cases. The software architecture is the structure of the system, comprising software components, their externally visible properties, and the relationships among them; it is a meta-structure created from the requirements and specifications. Architecture aids software engineers in making early design decisions that have a profound impact on the software engineering activities involved in the success of the system. The major work in modeling the architecture is to identify qualified, adaptable, updatable components and determine how these components are associated to build a new product. In the current work, we provide innovative patterns which help in developing a product using component-based software engineering. We discuss how to create components by applying various innovative patterns such as subtraction, multiplication, division, task unification, and attribute dependency change, and we depict the results of applying these patterns to some software projects.
Full Paper

IJCST/31/1/
A-407
   16 Energy-Efficient Target Tracking Algorithms in Wireless Sensor Networks: An Overview
M.Nandhini, V.R.Sarma Dhulipala

Abstract

Wireless sensor networks find application in areas such as target detection and tracking, environmental monitoring, industrial process monitoring, and tactical systems. Energy efficiency is one of the important research issues in WSNs, since it determines the lifetime of the sensor network deployed for the intended applications. Target tracking is one of the killer applications of wireless sensor networks, and energy-efficient target tracking algorithms are required for accurate tracking. In this paper, the focus is mainly on a survey of energy-efficient target tracking routing algorithms for wireless sensor networks.
Full Paper

IJCST/31/1/
A-408
   17 An Approach to Encryption using Superior Mandelbrot and Superior Julia Sets
Charu Gupta, Dr. Manish Kumar

Abstract

The voluminous exchange of digital data between computers has introduced a large number of security vulnerabilities, and encryption schemes are increasingly studied to meet the demand for real-time secure transmission of data over the Internet and wireless networks. In this paper, we study a new cryptographic key exchange protocol based on superior Mandelbrot and superior Julia sets. We analyze a cryptographic system utilizing fractal theory; the approach uses the concept of public key cryptography by taking advantage of the connection between superior Julia and superior Mandelbrot sets, exploiting the main feature of public key security.
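For orientation, the sketch below is an escape-time membership test using the Mann-type iteration z_{n+1} = s(z_n^2 + c) + (1 - s)z_n, which reduces to the classical Mandelbrot iteration at s = 1. This is only background on the sets named above, under the assumption that "superior" refers to Mann iteration; the paper's key-exchange protocol itself is not reproduced here.

```python
# Escape-time test: c is deemed inside the set if the orbit of 0 stays
# bounded for max_iter steps. s is the Mann relaxation parameter
# (s = 1.0 gives the standard Mandelbrot iteration). Illustration only.
def in_superior_mandelbrot(c, s=1.0, max_iter=100, bound=2.0):
    z = 0j
    for _ in range(max_iter):
        z = s * (z * z + c) + (1 - s) * z
        if abs(z) > bound:
            return False  # orbit escaped: c is outside the set
    return True
```

For instance, c = 0 and c = -1 stay bounded under the standard iteration, while c = 2 escapes immediately.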
Full Paper

IJCST/31/1/
A-409
   18 Web Usage Mining by Data Preprocessing
Arshi Shamsi, Rahul Nayak, Pankaj Pratap Singh, Mahesh Kumar Tiwari

Abstract

Nowadays many web sites are growing, which indirectly increases the complexity of web site design. To manage this, we have to understand how web sites are being used: how navigation is done, how many users visit, and how much time users spend on pages. This is done using web usage mining. Data sources for usage mining include client-side cookies, web server logs, software agents, etc. This paper presents how web server log data is preprocessed, which includes data cleaning, user identification, sessionization, and path completion. Once the data is preprocessed, it is used for discovering useful patterns.
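Of the preprocessing steps listed, sessionization is easy to sketch: group one user's requests into sessions separated by an inactivity timeout. The 30-minute timeout is a common heuristic assumed here, and the timestamps are invented, not a real server log.

```python
from datetime import datetime, timedelta

# Split one user's request timestamps into sessions: a gap longer than
# `timeout` between consecutive requests starts a new session.
def sessionize(timestamps, timeout=timedelta(minutes=30)):
    sessions = []
    for t in sorted(timestamps):
        if sessions and t - sessions[-1][-1] <= timeout:
            sessions[-1].append(t)   # within timeout: same session
        else:
            sessions.append([t])     # inactivity gap: new session
    return sessions

hits = [datetime(2012, 1, 5, 10, 0), datetime(2012, 1, 5, 10, 10),
        datetime(2012, 1, 5, 11, 30)]
sessions = sessionize(hits)
```

The three hits above split into two sessions, since the 80-minute gap before 11:30 exceeds the timeout. A full pipeline would first clean the log and identify users (by IP and user agent) before this step.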
Full Paper

IJCST/31/1/
A-410
   19 Use of E-Learning in Uttarakhand School Education System: Case Study of Open Source E-Learning Tools for Fundamental Mathematics and Sciences
Darshana Pathak Joshi, Jatin Pandey

Abstract

"Learning is lighting the candle, not filling the bucket," and every student has a natural curiosity to learn something new. Learning new things and sharing discoveries with friends, teachers, and parents is a source of happiness for students. To help them become happier and more responsible citizens, schools must provide much more than defined learning pedagogies, syllabi, and examinations. The need of the hour is to develop a new attitude towards the student and to address their myriad learning requirements. E-learning is a powerful tool for the shift toward a new educational domain: effective e-learning comes from using Information and Communication Technologies (ICT) to broaden educational opportunity and help students develop the skills they need to flourish in the 21st century. While conclusive longitudinal studies remain to be done, an emerging body of evidence suggests that e-learning can deliver substantial positive effects. For students, it provides a cognitive basis, i.e., processes of model building which, when supported by data and visual representation, can aid deeper, more insightful learning. Teachers are challenged to treat their students not as passive receptors of knowledge and to take a more positive attitude toward their work. Communities benefit from bridging the digital divide, with economically disadvantaged students and children with disabilities benefiting particularly, and the productivity and efficiency of the education system increase. Technologies like GIS can aid access to a wide range of location-related information by introducing resource education, an immense requirement of the school education system, particularly in India. This paper summarizes key findings on the present status of ICT use in the school education system and the learning paradigms of schools in Uttarakhand, and introduces some open-source tools for teaching basic mathematics (geometry) and science as a collaborative learning platform, with a brief description of K12. It also addresses the possibility of using GIS technology to develop a collaborative and interactive learning platform.
Full Paper

IJCST/31/1/
A-411
   20 Item Set Mining Supported by IMine Index
T.Sunitha, G.Srujana

Abstract

This paper presents the IMine index, a general and compact structure which provides tight integration of item set extraction in a relational DBMS. Since no constraint is enforced during the index creation phase, IMine provides a complete representation of the original database. To reduce I/O cost, data accessed together during the same extraction phase are clustered on the same disk block. The IMine index structure can be efficiently exploited by different item set extraction algorithms; in particular, its data access methods currently support the FP-growth and LCM v.2 algorithms, and they can straightforwardly support the enforcement of various constraint categories. The IMine index has been integrated into the PostgreSQL DBMS and exploits its physical-level access methods. Experiments, run for both sparse and dense data distributions, show the efficiency of the proposed index and its linear scalability even for large data sets. Item set mining supported by the IMine index shows performance always comparable with, and often (especially for low supports) better than, state-of-the-art algorithms accessing data on flat files.
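To make the extraction task concrete, here is a deliberately brute-force frequent-item-set counter over a toy transaction set. Real systems, including the FP-growth and LCM v.2 algorithms the IMine index supports, avoid this exhaustive enumeration; this is illustration only.

```python
from itertools import combinations

# Count support for every candidate item set and keep those meeting
# the minimum support threshold. Exponential in the number of items;
# fine for a toy example, never for real data.
def frequent_itemsets(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    for size in range(1, len(items) + 1):
        for candidate in combinations(items, size):
            support = sum(1 for t in transactions if set(candidate) <= t)
            if support >= min_support:
                frequent[candidate] = support
    return frequent

tx = [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}]     # made-up transactions
result = frequent_itemsets(tx, min_support=2)
```

With a support threshold of 2, the item sets {a}, {b}, {c}, {a, b}, and {a, c} are frequent, while {b, c} (support 1) is pruned. An index like IMine exists precisely so that such counting can reuse precomputed, disk-clustered structures instead of rescanning the database.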
Full Paper

IJCST/31/1/
A-412
   21 Design of a Frequent Pattern Mining Based on Systolic Trees
R.P.S.Manikandan, V.Shanmugam

Abstract

In this paper, we introduce a systolic tree concept to mine frequently occurring patterns, where the term "pattern" covers item sets, substructures, and subsequences. Association rules are used for predicting and finding the sets of commonly occurring items in large databases. The process is implemented in Java on a Tomcat server with an Oracle database, and we show that the performance improvement is higher than that of the existing approach. We also propose a ranking scheme for frequently occurring patterns that is visible to all users, so that users can get an idea of how to choose and buy the right product; minimum-item mining results are also displayed to the user. By displaying the ranking scheme and the minimum-item mining based on the transactions, we can give the user a suggestion, so that the user gets a clear idea before buying a product. Hence our paper shows major improvements for the e-commerce facility.
Full Paper

IJCST/31/1/
A-413
   22 Enhancing Security in Anonymizing Networks
S.Vinoth Kumar, R.Rajendran

Abstract

Anonymizing networks allow anyone to visit the public areas of the network: users access Internet services through a series of routers, which hides their identities and IP addresses from the server. This can be an advantage for misbehaving users who want to attack popular websites. To avoid such activity, servers may try to block the misbehaving user, but this is not possible in an anonymizing network: since servers are not aware of users' identities or IP addresses, they cannot block a particular misbehaving user. In such cases servers may block the entire network, which also affects well-behaved users. To overcome this problem, a system is designed in which servers can blacklist misbehaving users without compromising their anonymity. The proposed system accurately identifies misbehaving users and maintains the blacklisted users' details at the server; servers can blacklist users for any reason, and the privacy of blacklisted users is maintained.
Full Paper

IJCST/31/1/
A-414
   23 Development of Association Rule Based Prediction Model for Web Documents
Sachin Sharma, Simple Sharma, Anupriya Jain, Rashmi Aggarwal, Seema Sharma

Abstract

The rapid expansion of the WWW has created an unprecedented opportunity to disseminate and gather information online, and electronic commerce is emerging as its biggest application. As this trend grows stronger, there is a pressing need to study web-user behavior in order to better serve users and increase the value of enterprises. One important data source for this study is the web-log data that traces users' browsing actions. From the web logs, one can build models that predict the user's next request with high accuracy based on past behavior. Doing this with traditional association rule methods causes a number of serious problems, due to the extremely large data size and the rich domain knowledge that must be applied; moreover, most web log data are sequential in nature and exhibit "most recent, most important" behavior. To overcome these difficulties, we examine two dimensions of building prediction models. This paper proposes a better overall method for prediction model representation and refinement.
Full Paper

IJCST/31/1/
A-415
   24 A Novel Efficient Pattern Matching Packet Inspection using dFA
N.Kannaiya Raja, Dr. K.Arulanandam, G. Ambika

Abstract

Deep packet inspection is an advanced method of packet filtering that functions at the application layer of the OSI reference model. It examines the data part of a packet as it passes an inspection point, searching for protocol non-compliance, viruses, spam, intrusions, or other predefined criteria, to decide whether the packet may pass, needs to be routed to a different destination, or should be recorded for statistical purposes. Deterministic Finite Automata (DFAs) built from large rule sets require an amount of memory that turns out to be too large for practical implementation, so we present a new compressed representation for deterministic finite automata, called Delta Finite Automata (dFA). The algorithm considerably reduces the number of states and transitions; it is based on the observation that most adjacent states share several common transitions, so it is convenient to store only the differences between them. We then present an improvement to dFA that exploits the Nth-order dependence between states and further reduces the number of transitions by adopting the concept of temporary transitions; this scheme is named dnFA. Both schemes are orthogonal to most previous solutions, thus allowing for higher compression rates. A new encoding scheme for states, which we refer to as char-state, is also proposed; it exploits the association of many states with a few input characters. This compression scheme can be efficiently integrated into dFA and dnFA, allowing a further memory reduction with a negligible increase in state lookup time. Experimental runs have shown remarkable results in terms of lookup speed as well as memory consumption.
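The delta idea described above (adjacent states share most transitions, so store only the differences) can be sketched on toy transition tables. This illustrates the principle only, not the paper's actual dFA encoding.

```python
# Store a full transition table only for a base state; for a neighbour,
# keep just the transitions that differ. Decoding overlays the delta on
# the base. Toy DFA states mapping input symbol -> next-state id.
def delta_encode(base, state):
    return {sym: dst for sym, dst in state.items() if base.get(sym) != dst}

def delta_decode(base, delta):
    merged = dict(base)
    merged.update(delta)   # differences override the shared defaults
    return merged

s0 = {"a": 1, "b": 2, "c": 0}   # base state: full table
s1 = {"a": 1, "b": 3, "c": 0}   # differs from s0 only on symbol "b"
delta = delta_encode(s0, s1)
```

Here s1 is stored as a single-entry delta instead of a full three-entry table; over thousands of similar states this is where the memory saving comes from, at the cost of an overlay step during lookup.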
Full Paper

IJCST/31/1/
A-416
   25 Man-made Object Feature Extraction in SAR Images using Gabor Wavelet
P. Vasuki, Dr. S.Mohamed Mansoor Roomi

Abstract

In this paper, a novel descriptive feature extraction method for Synthetic Aperture Radar (SAR) images is proposed. The new method is based on the Gabor wavelet and is evaluated on the MSTAR image database. The extracted features are classified using a neural network. The classification process has the following stages: (1) image preprocessing (median filtering, histogram equalization, binarization); (2) feature extraction using the Gabor wavelet transform. The algorithm has been applied to three classes of military man-made (metal) objects in SAR imagery from the MSTAR public release database, and experimental results are presented.
Full Paper

IJCST/31/1/
A-417
   26 Fundamentals of Content-Based Image Retrieval
Ritika Hirwane

Abstract

The aim of this paper is to review the present state of the art in Content-Based Image Retrieval (CBIR), a technique for retrieving images on the basis of automatically derived features such as color, texture, and shape. Our findings are based both on a review of the relevant literature and on discussions with researchers in the field. The need to find a desired image in a collection is shared by many professional groups, including journalists, design engineers, and art historians. Since the requirements of image users can vary considerably, it is useful to classify image queries into three levels of abstraction: first, primitive features such as color or shape; second, logical features such as the identity of the objects shown; and last, abstract attributes such as the significance of the scenes depicted. While CBIR systems currently operate well only at the lowest of these levels, most users demand higher levels of retrieval.
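A minimal example of retrieval at the first, primitive level (color) is histogram intersection between a query histogram and a small database. The tiny 4-bin histograms below are invented for illustration; real CBIR systems compute such histograms from pixel data.

```python
# Histogram intersection: sum of bin-wise minima, normalised by the
# query's mass. Higher = more similar colour distribution.
def intersection(h1, h2):
    return sum(min(a, b) for a, b in zip(h1, h2)) / sum(h1)

query = [0.5, 0.3, 0.1, 0.1]   # normalised colour histogram (assumed)
db = {
    "sunset": [0.45, 0.35, 0.1, 0.1],   # similar colour mix to the query
    "forest": [0.05, 0.15, 0.6, 0.2],   # very different colour mix
}
best = max(db, key=lambda name: intersection(query, db[name]))
```

The query matches "sunset" (intersection 0.95) far better than "forest" (0.4), which is exactly the sort of primitive-level match the review notes current systems handle well, while object identity and scene meaning remain out of reach for this representation.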
Full Paper

IJCST/31/1/
A-418
   27 Implementation of MA-ABE for Better Data Security in Cloud
Nallamalli Ramesh, Gorantla Praveen, V.P. Krishna Anne, Dr. Rajasekhara Rao Kurra

Abstract

As more and more sensitive data is shared and stored by third parties on the Internet, there is a growing need to encrypt the data stored at these sites. One drawback of encrypting data is that it can then be selectively shared only at a coarse-grained level (i.e., by giving another party our private key). Here we develop a new cryptosystem for fine-grained sharing of encrypted data that we call Multi-Authority Attribute-Based Encryption (MA-ABE). The proposed scheme allows any polynomial number of independent authorities to monitor attributes and distribute secret keys. Each authority k chooses a number dk and a set of attributes, and a message can be encrypted such that a user can only decrypt it if he has at least dk of the given attributes from each authority k. The scheme can tolerate an arbitrary number of corrupt authorities. We also show how to apply these techniques to achieve a multi-authority version of the small system.
Full Paper

IJCST/31/1/
A-419
   28 Low Power Structure Design of GLFSR for BISR Based Application
Kakarla Hari Kishore, Dr. Fazal Noor Basha, Pavuluri Srinivas, Atluri Jhansi, Shaik Moulali

Abstract

In this paper, the structure design and optimization of a Built-In Self-Repair (BISR) design based on Generalized Linear Feedback Shift Registers (GLFSRs) are described. A new and effective pseudorandom test pattern generator, termed the GLFSR, is introduced: GLFSRs are Linear Feedback Shift Registers (LFSRs) over an extension Galois field rather than the binary field used by conventional LFSRs; they are not equivalent to cellular arrays and are shown to achieve significantly higher fault coverage. Experimental results presented in this paper show that the proposed GLFSR can attain fault coverage equivalent to that of an LFSR, but with significantly fewer patterns. The proposed GLFSR structure is implemented on a SPARTAN-3E using VHDL, and the percentage improvement in fault coverage is reported. The approach also reduces the number of transitions in the scan chains, thus minimizing power consumption; using an encoding algorithm yields a further percentage improvement in power consumption.
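For background, a conventional binary LFSR of the kind the GLFSR generalizes can be sketched in a few lines. The taps below implement the textbook primitive polynomial x^4 + x^3 + 1 (a standard example, not the paper's design), giving the maximal period of 15 for a 4-bit register.

```python
# 4-bit Fibonacci LFSR over GF(2): output the low bit, feed back the
# XOR of the tapped bits into the high bit. Taps (3, 0) realise the
# primitive polynomial x^4 + x^3 + 1, so any nonzero seed cycles
# through all 15 nonzero states.
def lfsr_sequence(seed, length, taps=(3, 0)):
    state = seed & 0xF
    out = []
    for _ in range(length):
        out.append(state & 1)
        fb = ((state >> taps[0]) ^ (state >> taps[1])) & 1
        state = (state >> 1) | (fb << 3)
    return out

bits = lfsr_sequence(0b1000, 15)   # one full period of the m-sequence
```

One period of this m-sequence has 8 ones and 7 zeros, a balance property of maximal-length sequences; a GLFSR keeps the same shift-register idea but operates on multi-bit symbols over a larger field, which is what improves fault coverage per pattern.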
Full Paper

IJCST/31/1/
A-420
   29 Background of Surgical Robots and Robotics in Different Surgeries
Y.V.K.D. Bhavani, Y. Vijaya

Abstract

The objective is to review the history, development, and current applications of robotics in surgery. Surgical robotics is a new technology that holds significant promise: robotic surgery is often heralded as the new revolution, and it is one of the most talked-about subjects in surgery today. Up to this point, however, the drive to develop and obtain robotic devices has been largely market-driven. There is no doubt that they will become an important tool in the surgical armamentarium, but the extent of their use is still evolving. A review of the literature was undertaken using Medline; articles describing the history and development of surgical robots were identified, as were articles reporting data on applications. Several centers are currently using surgical robots and publishing data, and most of these early studies report that robotic surgery is feasible. There is, however, a paucity of data regarding the costs and benefits of robotics versus conventional techniques. Robotic surgery is still in its infancy, and its current practical uses are mostly confined to smaller surgical procedures.
Full Paper

IJCST/31/1/
A-421
   30 Traditional Resource Allocation Algorithms in Wireless Networks
Manish Varshney, Dr. Yashpal Singh, Rati Agrawal

Abstract

In this paper, we study utility-based maximization for resource allocation in the downlink direction of centralized wireless networks. We consider two types of traffic, i.e., best-effort and hard-QoS, and develop some essential theorems for optimal wireless resource allocation; we then propose three allocation schemes and evaluate their performance via simulations. The results show that optimal wireless resource allocation depends on traffic type, total available resource, and channel quality, rather than solely on the channel quality or traffic type as assumed in most existing work. We also focus on "user satisfaction" in resource allocation to avoid the "throughput-fairness" dilemma: since it is unlikely that the different demands of all users can be fully satisfied, we instead maximize the total degree of user satisfaction.
Full Paper

IJCST/31/1/
A-422
   31 Enhanced Software Development for Reuse Process Model in Component Based Software Engineering (CBSE)
Virendra Kumar, Shabina Ghafir

Abstract

In software engineering, Component-Based Software Engineering (CBSE) is an important emerging topic and is at the centre of many new research projects. CBSE is concerned with the assembly of pre-existing, reusable software components into a software system that meets client-specific requirements. In this paper, we develop an enhanced software-development-for-reuse process model, discuss the future prospects of CBSE, and show how component reusability can reduce the time, effort, and cost of software development.
Full Paper

IJCST/31/1/
A-423
   32 An 8×8 Multiplier Testing using Saboteurs and Mutants in VHDL
D.Naga Dilip Kumar, B.K.V.Prasad, M.Sivakumar, Syed.Inthiyaz

Abstract

To overcome the limitations of structural tests in manufacturing, we turn to functional test sequences. In this paper, we present the testing of an 8×8 array multiplier using saboteurs and mutants in VHDL.
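The mutant-testing idea carries over directly to any language: compare a golden model against a deliberately faulted copy and check whether a test-vector set exposes the difference. The Python sketch below uses a hypothetical stuck-at-0 fault on one operand bit in place of a VHDL saboteur on a signal.

```python
def multiplier(a, b):
    """Reference behaviour of an 8x8 unsigned multiplier (16-bit product)."""
    return (a * b) & 0xFFFF

def mutant(a, b):
    """Hypothetical mutant: a stuck-at-0 fault on bit 3 of operand `a`,
    the kind of error a VHDL saboteur would inject on a signal."""
    return ((a & ~0x08) * b) & 0xFFFF

def detects(test_vectors, faulty):
    """A test set detects the mutant if some vector's output differs
    from the golden model's output."""
    return any(multiplier(a, b) != faulty(a, b) for a, b in test_vectors)

weak = [(0, 0), (1, 1), (2, 2)]              # never exercises bit 3
strong = [(0xFF, 0xFF), (0x08, 1), (170, 85)]
print(detects(weak, mutant), detects(strong, mutant))
```

A mutant that survives a test set (as with `weak` here) signals a coverage gap, which is the whole point of mutation-based test evaluation.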
Full Paper

IJCST/31/1/
A-424
   33 Analysis of CURE and K-Medoids in the Presence of Noise
D.Raghu, Tota Siva Rama Krishna, Ch.Raja Jacob

Abstract

Mining information and knowledge patterns from large databases has been recognized by many researchers as a key research topic in database systems, knowledge-base systems, statistics, and information services. Cluster analysis is one of the main analytical methods in data mining, and the choice of clustering algorithm directly influences the clustering results. Clustering can be applied to a database using approaches based on distance, density, hierarchy, or partitioning. The presence of noise (data items irrelevant to the mining task) is a major problem in clustering. The objective of this paper is to present clustering algorithms that handle noise effectively. Our focus is to show the effect of noise on the performance of various clustering techniques and to study how noise affects the clustering process in terms of time and space. We have implemented clustering techniques such as CURE and K-Medoids and computed the time and space complexity of each for different numbers of clusters. The results are presented visually as line and bar charts, from which we conclude which algorithm deals with noise more efficiently.
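For readers unfamiliar with K-Medoids, a minimal PAM-style loop is sketched below: medoids are actual data points, which is what makes the method more robust to noise than K-Means centroids. The Manhattan distance and the toy data set (with one noise point) are our own illustrative choices.

```python
import random

def kmedoids(points, k, dist, iters=50, seed=0):
    """Minimal PAM-style K-Medoids: alternate assignment and medoid
    update until the medoids stop changing."""
    rng = random.Random(seed)
    medoids = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: attach each point to its nearest medoid
        clusters = {m: [] for m in medoids}
        for p in points:
            nearest = min(medoids, key=lambda m: dist(p, m))
            clusters[nearest].append(p)
        # update step: new medoid minimises total in-cluster distance
        new = [min(members, key=lambda c: sum(dist(c, q) for q in members))
               for members in clusters.values()]
        if new == medoids:
            break
        medoids = new
    return medoids

manhattan = lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1])
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (100, 100)]
meds = kmedoids(data, 2, manhattan)   # the last point acts as noise
print(sorted(meds))
```

Because a medoid must itself be a data point, a single far-away noise point shifts the cluster representative far less than it would shift a K-Means centroid.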
Full Paper

IJCST/31/1/
A-425
   34 Genetic Algorithm Approach to Intrusion Detection System
B. Uppalaiah, K. Anand, B. Narsimha, S. Swaraj, T. Bharat

Abstract

With the rapid growth of computer networks over the past decade, security has become a very important issue for computer systems. Detecting attacks against computer networks using an IDS is a major problem in the area of network security. In this paper we present a Genetic Algorithm (GA) to identify various harmful or attack-type connections. The algorithm considers different features of network connections, such as protocol type, duration, service, and dst_host_srv_count, to generate a classification rule set, where each rule identifies a specific type of attack. For this experiment, we implemented the GA and trained it on the KDDCUP99 dataset to generate a set of rules that an IDS can apply to identify and classify different types of attack connections. The characteristics of attacks such as Smurf, Warezmaster, Saint, Mailbomb, Multihop, IPsweep, Snmpguess, and Buffer-overflow were summarized from the KDD99 dataset, and the effectiveness and robustness of the approach were demonstrated. The rules detect Denial-of-Service and Probe attack connections with high accuracy, and U2R and R2L connections with appreciable accuracy. These findings give promising results towards applying GAs to network intrusion detection.
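A toy version of the rule-evolution idea is sketched below: chromosomes are feature-threshold vectors, and fitness rewards rules that fire on attack records but not on normal ones. The records, features, and fitness weighting are illustrative assumptions, not the paper's KDDCUP99 setup.

```python
import random

# Tiny synthetic connection records: (duration, src_bytes, count), label.
# Features and values are illustrative stand-ins for KDDCUP99 fields.
DATA = [((0, 520, 100), "smurf"), ((0, 480, 120), "smurf"),
        ((1, 500, 90), "smurf"), ((2, 300, 3), "normal"),
        ((5, 150, 2), "normal"), ((3, 200, 1), "normal")]

def matches(rule, rec):
    """A rule is a vector of thresholds; it fires when every feature
    meets or exceeds its threshold."""
    return all(f >= t for f, t in zip(rec, rule))

def fitness(rule):
    """Reward firing on attacks, penalise firing on normal traffic."""
    tp = sum(1 for r, lab in DATA if lab == "smurf" and matches(rule, r))
    fp = sum(1 for r, lab in DATA if lab == "normal" and matches(rule, r))
    return tp - 2 * fp

def evolve(pop=20, gens=40, seed=1):
    rng = random.Random(seed)
    P = [(rng.randint(0, 5), rng.randint(0, 600), rng.randint(0, 150))
         for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        parents = P[:pop // 2]                 # elitist selection
        kids = []
        while len(parents) + len(kids) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randint(1, 2)            # one-point crossover
            child = list(a[:cut] + b[cut:])
            if rng.random() < 0.2:             # mutation: perturb a gene
                i = rng.randrange(3)
                child[i] = max(0, child[i] + rng.randint(-50, 50))
            kids.append(tuple(child))
        P = parents + kids
    return max(P, key=fitness)

best = evolve()
print(best, fitness(best))
```

In a real setting each evolved threshold vector would be translated into an if-then IDS rule, one rule population per attack class.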
Full Paper

IJCST/31/1/
A-426
   35 Identification of Intrusions in Network for Large Data Base using Soft Computing Approach
Tanveer Fatema Khan, Zuber Farooqui, Vineet Richhariya

Abstract

Nowadays Intrusion Detection Systems (IDS) are very important for every information technology company concerned with security and sensitive systems. Although much research has been done on this topic, the perfect IDS has still not been found, and it remains a hot and challenging area of computer security research. Current intrusion detection techniques mainly focus on discovering abnormal system events in computer networks and distributed communication systems. Most existing IDS use all 41 features of the network records to look for intrusive patterns, yet some of these features are redundant or irrelevant; the drawback of this approach is a time-consuming detection process and degraded IDS performance. In the proposed system, Principal Component Analysis (PCA) is used to reduce the number of features in the KDD dataset. After feature reduction, we design a fuzzy-logic-based system for effectively identifying intrusion activity within a network. The system can detect intrusive behaviour because its rule base contains a well-chosen set of rules, generated by an automated strategy that derives fuzzy rules from definite rules using frequent items. The experiments and evaluations of the proposed intrusion detection system are performed with the KDD Cup 99 intrusion detection dataset. The experimental results clearly show that the proposed system achieves a comparable, and in some cases higher, rate in identifying whether the records in the network represent intrusive or normal activity.
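To illustrate the PCA feature-reduction step, the sketch below projects 2-D records onto their first principal component using the closed-form eigenvector of a 2×2 covariance matrix; real use on the 41 KDD features would require a full eigendecomposition, and the data points here are invented for illustration.

```python
import math

def pca_1d(points):
    """Project 2-D points onto their first principal component using
    the closed-form principal angle of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n            # var(x)
    c = sum((p[1] - my) ** 2 for p in points) / n            # var(y)
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n   # cov(x, y)
    theta = 0.5 * math.atan2(2 * b, a - c)                   # principal angle
    ux, uy = math.cos(theta), math.sin(theta)
    # score = projection of each centred point onto the principal axis
    return [(p[0] - mx) * ux + (p[1] - my) * uy for p in points]

pts = [(1, 1.1), (2, 1.9), (3, 3.2), (4, 3.9)]
scores = pca_1d(pts)   # four 1-D scores replacing four 2-D records
print(scores)
```

Each record is thus replaced by a single score along the direction of greatest variance, which is the dimensionality reduction the abstract relies on.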
Full Paper

IJCST/31/1/
A-427
   36 Analysis of Association Rule Mining using Bayesian Network
D.Raghu, P.Jagadeesh, CH. Raja Jacob

Abstract

Trivial rules are a problem in association rule mining that must be addressed, so that trivial rules are not retained early and interesting rules are not discarded. Rules expressed through relative comparison are more complete than rules generated through absolute comparison. Traditional association rule mining loses some information when generating rules, and the user must set support and confidence thresholds before rules can be generated. In this paper we propose a new approach to finding association rules that uses Bayesian network classification to generate them. This gives the decision maker more information for generating association rules than the traditional approach. The new approach handles uncertainty in the classification process, so that information loss is reduced and the result of data mining is enhanced. The new algorithm can also estimate probabilities over continuous data sets.
Full Paper

IJCST/31/1/
A-428
   37 Retinal Blood Vessel Topography Detection Based on Nonlinear Space Invariant Diffusion Process
Joshi Manisha Shivram, Dr.Rekha Patil, Dr. Aravind H. S

Abstract

Retinal vessel topography provides useful information for clinical diagnosis and treatment; hence the segmentation and quantification of blood vessel topography is of central interest in many diseases. A method is presented for segmenting retinal blood vessels, particularly small vessels. We present a comparative analysis of the linear scale-space paradigm against a scale space based on a nonlinear space-invariant diffusion process for vasculature segmentation. Performance is evaluated on the DRIVE database: the overall sensitivity and accuracy with the nonlinear space-invariant diffusion process are 77.04% and 94%, respectively.
Full Paper

IJCST/31/1/
A-429
   38 Comparison of Conditional Functional Dependencies using Fast CFD and CTANE Algorithms
D.Raghu, K.Thatha Reddy, Ch Raja Jacob

Abstract

Conditional Functional Dependencies (CFDs) extend Functional Dependencies (FDs) by supporting patterns of semantically related constants, and can be used as rules for cleaning relational data. However, finding CFDs is an expensive process that involves intensive manual effort. To identify data-cleaning rules effectively, we examine four techniques for discovering CFDs from sample relations. CFDMiner, based on techniques for mining closed itemsets, detects constant CFDs (CFDs with constant patterns only); it is an efficient heuristic algorithm that discovers patterns from a fixed FD, leveraging closed-itemset mining to reduce the search space. CTANE works well when the arity of the sample relation is small and the support threshold is high, but it scales poorly as the arity increases. FastCFD is more efficient when the arity is large. The greedy method is formally based on the desirable properties of support and confidence; it studies the computational complexity of automatically generating optimal tableaux and provides an efficient approximation algorithm. These techniques were implemented in earlier papers; we take the algorithms of all four, determine the time and space complexity of each to establish which technique is preferable in which case, and display the results as line and bar charts.
Full Paper

IJCST/31/1/
A-430
   39 Secure Multimedia Data using H.264/AVC and AES Algorithm
Deepali P. Chaudhari, Vaibhav Eknath Narawade

Abstract

The demand for multimedia information is increasing rapidly, so multimedia security has become one of the most important aspects of communication as the use of digital data transmission continues to grow. Many approaches exist to secure multimedia information. In this paper, selective encryption combined with compression is proposed: selective encryption is performed with the Advanced Encryption Standard (AES) algorithm, and compression with the H.264/AVC standard. The proposed system comprises two main functions. The first is the encoding/encryption of the video stream in two steps: the input video sequence is compressed by the H.264/AVC encoder, and the I-frames of the encoded bit stream are partially encrypted with the AES block cipher. The second is the decryption/decoding of the encrypted video: the encrypted I-frame stream is identified, the I-frames are decrypted, and the stream is decoded with the H.264/AVC decoder.
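The selective-encryption flow can be sketched as follows. Since AES itself is not in the Python standard library, a SHA-256 counter-mode keystream stands in for the block cipher here; only the structure (encrypting I-frames while leaving P/B-frames in the clear) mirrors the scheme described in the abstract.

```python
import hashlib

def keystream(key, n):
    """Toy keystream from SHA-256 in counter mode; a stdlib stand-in
    for the AES block cipher, NOT a real AES implementation."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def selective_crypt(frames, key):
    """Encrypt (or, applied again, decrypt) only the I-frames of an
    encoded stream; P/B-frames carry only differences and stay clear."""
    out = []
    for kind, payload in frames:
        if kind == "I":
            ks = keystream(key, len(payload))
            payload = bytes(p ^ k for p, k in zip(payload, ks))
        out.append((kind, payload))
    return out

stream = [("I", b"intra-coded frame"), ("P", b"delta-1"), ("B", b"delta-2")]
enc = selective_crypt(stream, b"secret")
dec = selective_crypt(enc, b"secret")    # XOR keystream is self-inverse
print(dec == stream)
```

Encrypting only the I-frames is what makes the scheme cheap: without the intra-coded reference frames, the dependent P/B-frames cannot be rendered.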
Full Paper

IJCST/31/1/
A-431
   40 Software Testing using Genetic Algorithm
Sanjay Kumar Sonkar, Dr.Anil Kumar Malviya, Dharmendra Lal Gupta, Ganesh Chandra

Abstract

Testing is a process used to establish the correctness, completeness, and quality of developed software. Apart from finding errors, testing also assesses performance, safety, fault tolerance, and security, and it is the most important quality-assurance measure for software. Because testing is time-consuming and laborious, techniques for automatic test-data generation can reduce cost and time. Software testing is an important and valuable part of the software development life cycle, and since exhaustive testing is infeasible due to time, cost, and other constraints, the testing process needs to be automated. Testing effectiveness can be achieved with State Transition Testing (STT), which is commonly used in real-time, embedded, and web-based software systems. The objective of this paper is to present an algorithm, based on a Genetic Algorithm technique, for generating optimal and minimal test sequences from the behavioural specification of software; the approach generates test sequences that achieve complete software coverage.
Full Paper

IJCST/31/1/
A-432
   41 Association Rule Mining using Count Distribution
D.Raghu, BH.Prasanth, Ch. Raja Jacob

Abstract

One of the important problems in data mining is discovering association rules from databases of transactions, where each transaction consists of a set of items. The most time-consuming operation in this discovery process is computing the frequency of occurrence of interesting subsets of items (called candidates) in the transaction database. To prune the exponentially large space of candidates, most existing algorithms consider only candidates that meet a user-defined minimum support. Even with pruning, finding all association rules requires substantial computation power and memory; parallel computers offer a potential solution, provided efficient and scalable parallel algorithms can be designed. In this paper, we implement sequential, parallel, and Count Distribution mining of association rules using Apriori algorithms and evaluate the performance of the three algorithms in terms of time and space.
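In Count Distribution, each processor counts candidate supports over its own partition of the transactions, and the local counts are then summed. The sketch below shows one Apriori pass over size-2 candidates; the sequential loop over partitions stands in for truly parallel processes, and the grocery data is invented for illustration.

```python
from collections import Counter
from itertools import combinations

def local_counts(partition, candidates):
    """One 'processor' counts candidate itemsets over its own
    partition of the transaction database."""
    counts = Counter()
    for t in partition:
        for cand in candidates:
            if cand <= t:          # candidate contained in transaction
                counts[cand] += 1
    return counts

def apriori_pass(transactions, minsup, n_parts=2):
    """One Apriori pass over size-2 candidates; local counts from each
    partition are summed, as in Count Distribution."""
    items = sorted({i for t in transactions for i in t})
    candidates = [frozenset(p) for p in combinations(items, 2)]
    size = len(transactions) // n_parts + 1
    parts = [transactions[i:i + size]
             for i in range(0, len(transactions), size)]
    total = Counter()
    for part in parts:             # these loops would run in parallel
        total.update(local_counts(part, candidates))
    return {c for c in candidates if total[c] >= minsup}

T = [{"milk", "bread"}, {"milk", "bread", "eggs"},
     {"bread", "eggs"}, {"milk", "eggs"}]
freq = apriori_pass(T, minsup=2)
print(sorted(sorted(s) for s in freq))
```

Because only the (small) count vectors need to be exchanged between processors, Count Distribution keeps communication cost low while the expensive subset checks run in parallel.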
Full Paper

IJCST/31/1/
A-433
   42 Image Retrieval using Upper Mean Color and Lower Mean Color of Image
Dr. N. S. T. Sai, R. C. Patil

Abstract

This paper presents a new idea for image retrieval using the upper mean color and lower mean color of an image as the feature vector. The feature vector is calculated from the bit planes of each image, and the paper compares the performance of 4 and 8 bit planes for grayscale and color images; the size of the feature vector therefore varies with the number of bit planes used. The proposed method was tested on a database of 930 images spanning 10 classes. We use simple Euclidean distance to compute the similarity between images for the Content-Based Image Retrieval application. The average precision and average recall of each image category, and the overall precision and recall, are used as performance measures.
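Our reading of the upper/lower mean feature is sketched below for a single channel: the "upper mean" averages pixel values above the overall mean, the "lower mean" those at or below it, and retrieval compares feature vectors by Euclidean distance. This is an illustrative reconstruction with invented pixel values, not the authors' exact bit-plane computation.

```python
def upper_lower_mean(pixels):
    """'Upper mean': average of pixel values above the overall mean;
    'lower mean': average of those at or below it. One such pair per
    channel/bit plane forms the feature vector (our reading)."""
    m = sum(pixels) / len(pixels)
    upper = [p for p in pixels if p > m]
    lower = [p for p in pixels if p <= m]
    upper_mean = sum(upper) / len(upper) if upper else m
    lower_mean = sum(lower) / len(lower) if lower else m
    return (upper_mean, lower_mean)

def euclidean(f1, f2):
    """Similarity measure between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5

query  = upper_lower_mean([10, 200, 220, 30, 15, 240])
db_img = upper_lower_mean([12, 205, 210, 28, 18, 235])
print(euclidean(query, db_img))   # small distance: similar images
```

Splitting the pixels around the mean gives a two-value summary that is more discriminative than a single global mean while staying cheap to compute and compare.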
Full Paper

IJCST/31/1/
A-434
   43 Performance Evaluation of Positive and Negative Association and Confined Rules
D.Raghu, A.Nalini, Ch.Raja Jacob

Abstract

Positive and negative association rules are important for finding useful information hidden in massive datasets; negative association rules, in particular, can reflect mutually exclusive correlations among items. Despite a great deal of research, a number of challenges remain in mining positive and negative association rules. To address the problems of determining frequent itemsets and of deleting contradictory positive and negative rules, this paper presents a new algorithm for mining positive and negative association rules. The algorithm applies a new measurement framework of support and confidence to solve these problems, and a performance study shows that the method is highly efficient and accurate in comparison with other reported mining methods. We propose an algorithm that extends the support-confidence framework with a sliding correlation-coefficient threshold: in addition to finding confident positive rules with strong correlation, it discovers negative association rules with strong negative correlation between antecedents and consequents. We evaluate performance in terms of the time and space complexity of the positive, negative, and confined association rules implemented.
Full Paper

IJCST/31/1/
A-435
   44 Performance Analysis of Efficient and Scalable Multicast Routing Protocol Over Mobile Ad-Hoc Networks using NS-2
V. Bharathi, R. Sofia

Abstract

The Efficient Geographic Multicast Protocol (EGMP) was designed to support group communication in MANETs. Its efficiency and scalability have already been tested with the Global Mobile Simulation library (GloMoSim); this paper instead tests packet loss, throughput, and routing overhead during packet transmission using Network Simulator-2 (NS-2), which helps evaluate the protocol's behaviour in a realistic environment at low cost. EGMP uses the IEEE 802.15.4 SSCS MAC-layer protocol with a data rate of 250 kbps and a radio frequency of 2.4 GHz. A network-wide, zone-based bidirectional tree is constructed to achieve efficient group-membership management, and every node is aware of its own position, which greatly reduces route-search overhead. Compared with MAODV (Multicast Ad-hoc On-Demand Distance Vector routing), EGMP achieves more efficient bandwidth utilization through lower control overhead, lower packet-transmission delay, and higher throughput.
Full Paper

IJCST/31/1/
A-436
   45 Memory Hierarchy Based Performance Review of Sorting Algorithms
D.Abhyankar, M.Ingle

Abstract

Sorting is one of the fundamental computing problems and continues to attract a great deal of research. The instruction-count requirements of the key sorting algorithms are well understood; however, the cost of executing a sorting algorithm is also influenced by factors beyond instruction count, such as cache misses and page faults. Studies of sorting algorithms from a memory-hierarchy perspective exist, but they are dated and losing relevance in the context of modern computer architecture. We have experimented with Quicksort, Merge sort, and Heapsort to assess their respective performance in the memory hierarchy of modern computing environments, and the inferences drawn from these experiments are presented in this paper. Empirical evidence confirms that Quicksort is still an excellent algorithm from the memory-hierarchy point of view. Merge sort performs poorly on large records because its page-fault count is too high, while Heapsort proved competitive enough to be considered an algorithm of choice for large records in view of its lower page-fault count.
Full Paper

IJCST/31/1/
A-437