
INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST)-VOL III ISSUE IV, VER. 3, OCT. TO DEC., 2012


S.No. Research Topic Paper ID
   86 A Channel Allocation Algorithm for Efficient QoS Provisioning in Wireless Networks

Abhishek Kumar Gupta, Dr. B.K. Gupta, Dr. Mohan Lal

Abstract

Next-generation high-speed cellular networks are expected to support multimedia applications, which require QoS provisions. Since radio spectrum is the most expensive resource in wireless networks, it is a challenge to support QoS using limited radio spectrum. Efficient use of such limited spectrum becomes even more important when there is heavy traffic in the network. The use of available channels has been shown to improve system capacity. The role of a channel assignment scheme is to allocate channels to cells in such a way as to minimize the call-blocking or call-dropping probability and to maximize throughput. This paper presents a new hybrid channel allocation (HCA) scheme, which is a combination of fixed channel allocation (FCA) and dynamic channel allocation (DCA). A network becomes heavily loaded when the bandwidth available in the cells is not enough to sustain the users' demand, and calls will be blocked or dropped. The results show that the HCA algorithm significantly reduces the call-blocking probability under heavy traffic load and can improve bandwidth utilization while providing QoS guarantees. Furthermore, all channels are placed in a central pool and assigned to base stations on demand. When a call using such a borrowed channel terminates, the cell may retain the channel depending upon its current condition; therefore HCA requires a comparatively much smaller number of reallocations than other schemes. The results also show that HCA behaves like FCA at high traffic loads and like DCA at low traffic loads, as it is designed to combine the advantages of both.
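The borrow-and-retain behaviour the abstract describes can be sketched in a few lines. The class below is purely illustrative: the load threshold and the retention rule are our assumptions, not the paper's exact algorithm.

```python
class HybridAllocator:
    """Toy hybrid channel allocation (HCA): each cell owns a fixed set of
    channels (the FCA part) and may borrow from a central pool on demand
    (the DCA part).  Illustrative sketch only."""

    def __init__(self, fixed_per_cell, pool, retain_threshold=0.5):
        self.fixed = {c: set(chs) for c, chs in fixed_per_cell.items()}
        self.free = {c: list(chs) for c, chs in fixed_per_cell.items()}
        self.pool = list(pool)                    # central pool of shared channels
        self.in_use = {c: [] for c in fixed_per_cell}
        self.retain_threshold = retain_threshold  # load above which a borrowed channel is kept

    def load(self, cell):
        total = len(self.in_use[cell]) + len(self.free[cell])
        return len(self.in_use[cell]) / total if total else 1.0

    def allocate(self, cell):
        if self.free[cell]:                       # prefer the cell's own channels
            ch = self.free[cell].pop()
        elif self.pool:                           # otherwise borrow from the pool
            ch = self.pool.pop()
        else:
            return None                           # no channel anywhere: call blocked
        self.in_use[cell].append(ch)
        return ch

    def release(self, cell, ch):
        self.in_use[cell].remove(ch)
        if ch in self.fixed[cell] or self.load(cell) > self.retain_threshold:
            self.free[cell].append(ch)            # keep it locally (own channel, or heavy load)
        else:
            self.pool.append(ch)                  # give the borrowed channel back to the pool
```

Retaining a borrowed channel while the cell is still heavily loaded is what keeps the number of reallocations low compared with pure DCA.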
Full Paper

IJCST/34/3/A-1181
   87 This Paper is Removed due to Copyright Issue
   88 Software Reliability Models and Procedure for Fitting a Software Reliability Model

Syed Khasim, Dr. R. Satya Prasad

Abstract

Software reliability is a useful measure in planning and controlling resources during the software development process, so that high-quality software can be developed. It is also a useful measure for giving the user confidence about software corrections. A number of analytical models have been proposed during the past twenty years for evaluating the reliability of software systems. In this paper we present software reliability models and a procedure for fitting a software reliability model.
Full Paper

IJCST/34/3/A-1183
   89 Object Oriented Software Security in Design Phase

Dr. Mohammad Ubaidullah Bokhari, Shams Tabrez Siddiqui, Hatem S. A. Hamatta

Abstract

Object-oriented design and development has become popular in today's software development environment. As the development of object-oriented software rises, security problems are also increasing. In software security, the effort and cost spent in later phases is much greater than in the initial phases of the software development process. It is understandable that most software is not well designed with respect to security concerns. Therefore, securing the software after deployment is not only costlier but more difficult too. The aim of this paper is to study better design considerations for securing object-oriented software and to discuss the tools that are required for secure software development. This paper provides an opportunity to understand the requirements for developing such a technique at the design phase.
Full Paper

IJCST/34/3/A-1184
   90 The Survey-Web Search Results Clustering and Basic of Clustering Process

Dr. P. N. Chatur, K. P. Wagh, Naresh G. Gadge

Abstract

As the number of documents on the web has proliferated, the low precision of conventional web search engines and the flat ranked-list presentation make it difficult for users to locate specific information of interest. Grouping web search results into a hierarchy of topics provides an alternative to the flat ranked list and facilitates searching and browsing. In this paper, we present a brief survey of previous work on web search results clustering and the basic steps of the web search results clustering process.
Full Paper

IJCST/34/3/A-1185
   91 Analysis of Penetration Testing and Vulnerability Assessments with New Professional Approach

Azhar Ali, Dr. Mohd. Rizwan Beg, Shish Ahmad, Arshad Ali

Abstract

In today's modern era, crucial company information is accessed, stored, and transferred electronically. The security of this information, and of the systems storing it, is critical to the reputation and prosperity of companies. Therefore, vulnerability assessments of computer systems are performed to obtain a complete evaluation of the security risks of the systems under investigation. Today's complex enterprise IT infrastructures consist of hundreds or thousands of systems, each component meticulously configured and integrated into a complex systems architecture. The professional IT staff responsible for securely establishing and maintaining these infrastructures must assess, on an ongoing basis, the real risks presented by system vulnerabilities. Attacks against computer systems, and the data contained within them, are becoming increasingly frequent and ever more sophisticated. Advanced Persistent Threats (APTs) can lead to exfiltration of data over extended periods. Organizations wishing to ensure the security of their systems may look towards adopting appropriate measures to protect themselves against potential security breaches. One such measure is to hire the services of penetration testers (or "pen-testers") to find vulnerabilities present in the organization's network and provide recommendations on how best to mitigate such risks. This paper discusses the definition and role of the modern pen-tester and summarizes current standards and professional qualifications. The paper further identifies issues arising from pen-testing, highlighting differences between what is generally expected of the role in industry and what is demanded by professional qualifications.
In this paper we provide an overview of penetration testing, discuss security vulnerabilities, and summarize the results and benefits of penetration testing realized by the IT executives interviewed.
Full Paper

IJCST/34/3/A-1186
   92 An Artificial Neural Networks Approach with NP Hardness for Efficiency Approximation

A. S. Gousia Banu, Dr. T. Swarnalatha, Hymavathi

Abstract

An Artificial Neural Network approach with NP-hardness for efficiency approximation is presented in this paper. The aim of the present research is to find the solution, and an efficiency approximation to the solution, of the global optimization problem, which is NP-hard. NP-hardness is exploited in the training algorithm of an artificial neural network approach for calculating the efficiency approximation. In the ANN's initial configuration, the left sigmoid function is used as the activation function for the input-hidden connections, and the right sigmoid function is used as the activation function for the hidden-output connections.
Full Paper

IJCST/34/3/A-1187
   93 Face Recognition Under Varying Lighting Conditions and Noise Using Texture-Based and SIFT Feature Sets

Vishnupriya. S, Dr. K. Lakshmi

Abstract

Recognizing face images reliably even under uncontrolled lighting conditions is one of the most important challenges for practical face recognition systems. We tackle this by combining the strengths of robust illumination normalization, local texture-based face representations, and multiple feature fusion. Additionally, we employ SIFT, an approach for detecting and extracting local feature descriptors. By combining the results produced by both of the above-mentioned approaches, we increase the accuracy of recognizing a given face in a group image, even under difficult lighting conditions and noise.
Full Paper

IJCST/34/3/A-1188
   94 A Structure to Improve Software Process Improvement Model for Small Industry

Manoj Kumar, Dr. Sohan Lal

Abstract

Software process improvement (SPI) is required to increase the productivity of software companies. It is a systematic approach to improving the capabilities and performance of software organizations. Improved software quality is critical to ensuring reliable products and services and to increasing customer confidence. One basic idea is to assess an organization's current activity and improve its software process on the basis of the competencies and experiences of the practitioners working in the organization. In the work presented in this paper, a survey was conducted to find out which activities in existing SPI models are important for small organizations. Both supplying and purchasing organizations participated in the survey. The survey resulted in a prioritization of SPI activities, which formed the base for the new SPI model. The authors named the proposed model SPISC – Software Process Improvement for Small Corporations.
Full Paper

IJCST/34/3/A-1189
   95 Reward based Action Learning from Vehicle Mounted Camera Images for Real-Time Simulation and Rendering of Indian Traffic

Dr. G. Murugaboopathi, G.Sankar

Abstract

A traffic simulation model with detailed simulation and realistic rendering serves the purpose of global traffic prediction as well as individual driver behavior analysis. Though there has been research in traffic simulation for years, the focus has been mainly on centralized approaches. The main drawback with the centralized approaches is that they are quite intractable, unstable and unrealistic. Recently many decentralized models are developed to reproduce global parameters like traffic flow rate from collective behavior of intelligent agents. But decentralized approaches which simulate local parameters like individual speed preferences do not consider the lane-less chaotic traffic existing in most Indian roads. They do not provide sufficient details for realistic rendering. As of now, there are no detailed simulation models available for realistic rendering that captures the essential features of Indian traffic scenario. The proposed work tries to avoid the above mentioned problems and aims to develop a decentralized system for detailed simulation and realistic rendering of Indian traffic. Some of the nationally significant applications of the proposed tool are urban planning, training novice drivers and serving as a test bed for new traffic implementations apart from games and entertainment. From fig. 1, it can be seen that research in traffic monitoring and simulation was started a century ago. Over the years, advanced techniques such as loop detectors and sensors have refined the traffic simulation models. Recently with the availability of devices and tools to capture and process real-time videos, there are new inputs to the learning algorithms used in traffic simulation. Loop detectors and road-side cameras are known to provide global traffic information such as vehicle type and density in a road, traffic jam and queue length at traffic junctions. But they are not sufficient to develop detailed simulation models which require realistic rendering. 
Though vehicle velocity may be measured using vehicle-mounted devices, that alone is not sufficient for the task. We need to learn the driver's actions with respect to the accelerator and brake controls to simulate the realistic path of the vehicle on the road. This emphasizes the need to use vehicle-mounted cameras for capturing vehicle traffic on a road.
Full Paper

IJCST/34/3/A-1190
   96 Wireless Personal Area Network Node Design with RFID Using ZigBee Transceiver Module

Reena Rani, Dr. Rajesh Singh

Abstract

Radio Frequency Identification (RFID) systems and Wireless Personal Area Networks (WPANs) are two different technologies that have created revolutions in the research and development field. Though they are different technologies with different application areas, integrating them opens several new doors in research. In this paper we investigate and design the steps to integrate an RFID system with a WPAN and identify new areas of application, such as marking attendance and locating a person or any other important asset in colleges, schools, and industries, by designing a personal area network in a real environmental condition so that the location can be tracked without any wired network. We also demonstrate a real designed prototype of the integrated technologies.
Full Paper

IJCST/34/3/A-1191
   97 Application of K-means Clustering Algorithm for Mining Educational Data

Alok Singh Chauhan, Varun Singh Chauhan, Narendra Kumar Sharma

Abstract

Higher education is a vital area for any successful country. Most research and inventions come from higher education, and the most valuable professionals, such as engineers, managers, and scientists, come through the channel of higher education. It would be better for institutions and universities to effectively trace the paths of their students and alumni: depending on the paths of its graduates, a university can adapt its current and future strategies. Data mining is used to extract meaningful information and to develop significant relationships among variables stored in large data sets or data warehouses. Clustering analysis is one of the main analytical methods in data mining, and the choice of clustering algorithm directly influences the clustering result. K-means clustering is a popular clustering algorithm based on partitioning the data. This paper discusses the K-means clustering algorithm and analyzes students' educational data; K-means clustering is used to discover knowledge from the educational environment.
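For reference, the core of the k-means procedure can be written in a few lines of pure Python. This sketch clusters 1-D values (e.g. student scores) purely for illustration; real educational data would be multi-dimensional.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means on 1-D numeric data: assign each point to the
    nearest centroid, recompute centroids as cluster means, repeat
    until the centroids stop moving."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)           # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                        # assignment step
            i = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[i].append(p)
        new = [sum(c) / len(c) if c else centroids[i]  # update step
               for i, c in enumerate(clusters)]
        if new == centroids:                    # converged
            break
        centroids = new
    return centroids, clusters
```

On a toy data set of six scores, the algorithm separates the low and high performers into two groups regardless of the random initialization.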
Full Paper

IJCST/34/3/A-1192
   98 Enhancing the Performance of the Website through Web Log Analysis and Improvement

Arvind K. Sharma, P.C. Gupta

Abstract

Web log data offers valuable insight into website usage. It represents the activity of many users over a potentially long period of time. In this paper we analyze web logs to determine different statistics, such as users' hourly, daily, and monthly average access statistics: number of visits, hits, total pages, files, URLs, referrers, user agents, total kilobytes, top search strings, most popular requested pages, and most popular images. A complete year of web log data was collected from the main web server of an educational institution's website, and an attempt has been made to enhance the performance of the website through web log analysis.
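A minimal sketch of the kind of statistics extracted from such logs, assuming Common Log Format entries (the field names below are our own choices, not the paper's):

```python
import re
from collections import Counter

# Common Log Format: host ident user [date:HH:MM:SS zone] "METHOD url proto" status size
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):(?P<hour>\d{2})[^\]]*\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def log_stats(lines):
    """Hourly hits, per-URL hits, and total bytes transferred
    from Common Log Format lines (malformed lines are skipped)."""
    hits_per_hour, hits_per_url = Counter(), Counter()
    total_bytes = 0
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        hits_per_hour[m.group('hour')] += 1
        hits_per_url[m.group('url')] += 1
        if m.group('size') != '-':              # '-' means no body was sent
            total_bytes += int(m.group('size'))
    return hits_per_hour, hits_per_url, total_bytes
```

Aggregating these counters by day or month gives the average access statistics the paper reports.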
Full Paper

IJCST/34/3/A-1193
   99 Effective Quantum Key Authentication in Optical Networks

G. Murali, R. Siva Ram Prasad, G. Venkateswarlu

Abstract

Secure distribution of the secret random bit sequences known as “key” material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is a new technique for secure key distribution with single-photon transmissions: Heisenberg’s uncertainty principle ensures that an adversary can neither successfully tap the key transmissions, nor avoid detection (Eavesdropping raises the key error rate above a threshold value). We have developed experimental quantum cryptography systems based on the transmission of nonorthogonal photon states to generate shared key material over multi-kilometer optical fiber paths and over line-of-sight links.
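The key-sifting step of a BB84-style protocol, which underlies the single-photon key distribution described above, can be simulated classically. This sketch omits the eavesdropper, error estimation, and privacy amplification; it only shows why matching measurement bases yield shared bits.

```python
import random

def bb84_sift(n, seed=1):
    """Simulate BB84 sifting without an eavesdropper: Alice sends random
    bits in random bases, Bob measures in random bases, and both keep
    only the positions where their bases agree."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0: rectilinear, 1: diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    # With matching bases Bob reads Alice's bit exactly; with mismatched
    # bases his result is random (and that position is discarded anyway).
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b) for a, b, ab, bb
            in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
```

An eavesdropper measuring in a random basis would corrupt roughly a quarter of the sifted bits, raising the error rate above the detection threshold the abstract mentions.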
Full Paper

IJCST/34/3/A-1194
   100 Framework for Educational Data Access Using Cloud Environment

Tejaswi Avula, Radhika Gudapati, Subrahmanyam Kodukula

Abstract

As an emerging technology and business paradigm, cloud computing has taken commercial computing by storm. Cloud computing platforms provide easy access to a company's high-performance computing and storage infrastructure through web services. With cloud computing, the aim is to reduce the heavy workload of students carrying huge books on their backs. By combining new technologies like cloud computing, 3G, and tablet PCs, a student can carry anything anywhere. For this we create a new virtual appliance called Random Access Data (RAD) to get better and faster information from the provider. With the proposed system, a student needs to carry only a single tablet PC to access any required information for their studies.
Full Paper

IJCST/34/3/A-1195
   101 A Novel Framework for Video Search

Sunil Parihar, Kshitij Pathak

Abstract

Technologies are growing continually, and as a result social networks and media collection libraries have grown large. In particular, searching for a video in media libraries on the network has become a challenging task. Presently, video search is mainly based on the text, titles, descriptions, and image features associated with a video. Many methods have been developed to improve video search performance, but they do not provide high accuracy on top-ranked documents. In this paper we present a novel framework that integrates multiple features and helps improve video search performance in terms of the relatedness of documents. We use semantic mapping and a feedback policy to gain high accuracy on top-ranked results. We evaluate the framework on the basis of two performance parameters, namely lost query results and ghost query results.
Full Paper

IJCST/34/3/A-1196
   102 Security of Wireless Sensor Network Using Random Key Chaining

Puneet Kumar Kaushal, Ranjana Singh Rathore, Amandeep

Abstract

In today's technology-driven era, systems are required that can utilize sensors and actuators, support real-time processing, and provide wireless communication along with computing capabilities and multiple types of memory. Sensing environmental changes and processing the retrieved data is one of the essential requirements of daily life and of real-time applications; as a result, Wireless Sensor Networks (WSNs) were developed [1]. A wireless sensor network is made up of hundreds or thousands of sensor nodes with sensing, computation, and communication capabilities; these nodes cooperatively monitor physical and environmental conditions, such as temperature, sound, vibration, pressure, motion, and pollutants, at different locations. In a WSN, sensors sense physical and environmental changes wherever they are deployed, and the sensed information is transferred from source nodes to the sink node for analysis. This information is very sensitive and needs to be secured from adversaries. In this paper a new mode of operation, Random Key Chaining (RKC), is suggested for implementation on wireless sensor networks in order to provide security.
Full Paper

IJCST/34/3/A-1197
   103 Half Bridge ZVS DC-DC Converter with DCS PWM Active Clamp Technique

J. Sivavara Prasad, Y. P. Obulesh, Ch. Saibabu, S. Ramalinga Reddy

Abstract

The Half-Bridge (HB) DC-DC converter is an attractive topology for middle-power-level applications owing to its simplicity. This paper presents a new control scheme, known as duty-cycle-shifted PWM (DCS PWM) control, which is applied to conventional HB DC-DC converters to achieve ZVS for one of the two switches without adding extra components and without the asymmetric penalties of complementary control. The concept of this new control scheme is to shift one of the two symmetric PWM driving signals close to the other, such that ZVS may be achieved for the lagging switch due to the shortened resonant interval. Moreover, based on the DCS PWM control, an active clamp branch comprising an auxiliary switch and a diode is added across the isolation transformer's primary winding in the half-bridge converter to achieve ZVS for the other main switch by utilizing energy stored in the transformer leakage inductance. The auxiliary switch also operates under ZVS and ZCS conditions. In addition, the proposed topology with DCS PWM control eliminates the ringing resulting from oscillation between the transformer leakage inductance and the switches' junction capacitances during the off-time period. Therefore, the proposed converter has the potential to operate at higher efficiencies and switching frequencies.
Full Paper

IJCST/34/3/A-1198
   104 Experimental Comparison of Classifier Accuracy on Cancer Gene Expression Data by Optimal Feature Subset Selection

Parul C. Patel, Mahesh H. Panchal

Abstract

The classification of cancer based on gene expression data is one of the most important procedures in bioinformatics, the science of organizing and analyzing biological data. Feature selection plays an important role in classification. Generally, cancer datasets have far fewer samples than genes (features). Feature selection is a necessary task to be accomplished before the classification process, as it is difficult to train a classifier on a data set with many dimensions (features), and doing so cannot give optimal results. Feature selection can smoothly extract the relevant features from a high-dimensional data set containing noisy, imprecise, and redundant information. This paper presents an experimental study of classifier accuracy before and after reducing the features of the dataset. To measure classifier accuracy, two induction algorithms are used: decision trees and naive Bayes.
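As one concrete instance of a filter-style selection criterion (illustrative only; not necessarily the method used in the paper), features can be ranked by how far apart the two class means are:

```python
def rank_features(samples, labels):
    """Rank feature indices by the absolute difference of class means,
    largest first -- a simple filter criterion for two-class data.
    samples: list of equal-length feature vectors; labels: class per sample."""
    classes = sorted(set(labels))
    assert len(classes) == 2, "this sketch assumes a two-class problem"
    n_feat = len(samples[0])

    def mean(cls, j):
        vals = [s[j] for s, l in zip(samples, labels) if l == cls]
        return sum(vals) / len(vals)

    scores = [abs(mean(classes[0], j) - mean(classes[1], j))
              for j in range(n_feat)]
    return sorted(range(n_feat), key=lambda j: -scores[j])
```

Keeping only the top-ranked features before training a decision tree or naive Bayes classifier is the kind of dimensionality reduction the experiment compares against.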
Full Paper

IJCST/34/3/A-1199
   105 Topology Control in Wireless Ad-Hoc Networks-Shared Links

K. Rajani Kumari, N. Nagendra Gopal

Abstract

Mutual Communication (MC) allows nodes to save power and to extend transmission coverage: multiple nodes simultaneously transmit the same packet to a receiver so that the combined signal at the receiver can be correctly decoded. However, prior work on topology control considers MC only for energy saving, not for coverage extension. We identify the challenges in developing a centralized topology control scheme, named Mutual Bridges, which reduces the transmission power of nodes as well as increases network connectivity. Since MC can reduce transmission power and extend transmission coverage, it has been considered in topology control protocols; however, prior research on topology control with MC focuses only on maintaining network connectivity and minimizing the transmission power of each node, while ignoring the energy efficiency of paths in the constructed topologies. This may cause inefficient routes and hurt overall network performance. In this paper, to address this problem, we introduce a new topology control problem, the energy-efficient topology control problem with mutual communication, and propose two topology control algorithms that build mutual energy spanners in which the energy efficiency of individual paths is guaranteed. Simulation results confirm the good performance of the proposed algorithms.
Full Paper

IJCST/34/3/A-1200
   106 Full Bridge DC-DC Step-Up Converter With ZVZCS PWM Control Scheme

J. Sivavara Prasad, Y. P. Obulesh, Ch. Saibabu, S. Ramalinga Reddy

Abstract

The Full-Bridge (FB) DC-DC converter is an attractive topology for high-power-level applications. Pulse-Width-Modulation (PWM) current-fed full-bridge DC-DC step-up converters are typically used in applications where the output voltage is considerably higher than the input voltage. In this paper, a comparison is made between two converter topologies of this type: the standard Zero-Voltage-Switching (ZVS) active-clamp topology and a new Zero-Current-Switching (ZCS) topology. The paper begins with a review of the operation of the ZVS active-clamp converter and of ZCS converters in general; the advantages and disadvantages of each approach are stated. A new ZCS-PWM current-fed DC-DC step-up full-bridge converter is then introduced. Finally, the performance of the two converters is compared, conclusions based on this comparison are stated, and the full-bridge DC-DC step-up converter is simulated with the help of MATLAB Simulink.
Full Paper

IJCST/34/3/A-1201
   107 Multi-Layer Perceptron Network for Handwritten English Character Recognition

Mohit Mittal, Tarun Bhalla

Abstract

Handwriting recognition has been one of the most emerging, fascinating, and challenging research areas in the field of image processing and pattern recognition. Handwriting recognition is classified into two types: off-line and on-line methods. Off-line recognition means recognizing characters from a document; in on-line handwriting recognition, the machine recognizes the writing while the user writes. The two use different techniques for recognizing characters. In this paper, off-line handwritten character recognition is discussed.
Full Paper

IJCST/34/3/A-1202
   108 A Comparative Analysis of Edge Detection Operators: Application for Text Detection

Rana Gill, Navneet Kaur, Ashwani Kumar Singla

Abstract

Edge detection is a fundamental tool in image processing and computer vision. The main objective of image analysis is to extract the dominating information. Edge detection is an image segmentation method based on discontinuities in intensity values. The purpose of this work is to extract and recognize text from camera-captured images based on edge detection operators. Five operators are used: Prewitt, Roberts, Sobel, LoG, and Canny; towards the end, all these operators are compared on the basis of their results and MATLAB simulations.
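The gradient operators being compared differ only in their convolution kernels. A minimal pure-Python sketch, showing the horizontal-gradient kernels for two of the five operators:

```python
def convolve2d(img, kernel):
    """'Valid'-mode 2-D correlation (the usual convention for edge
    kernels) on a list-of-lists grayscale image."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

# Horizontal-gradient kernels: Sobel weights the centre row more heavily,
# Prewitt weights all rows equally.
SOBEL_X   = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
```

On a vertical step edge both kernels respond strongly, but with different magnitudes, which is one of the behavioural differences such comparisons measure.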
Full Paper

IJCST/34/3/A-1203
   109 Analytical View of Web Content Adaptation

Neha Gupta

Abstract

Use of the Internet for educational purposes has grown to a great extent, and handheld devices are being used to access the Internet by school children and by their parents, for example to get information related to homework assignments. This means users can access the web using a PC at home or at the office and can access the same information on a mobile phone while traveling. Although there have been advances in technology and bandwidth, mobile devices are still limited by small screen sizes, which limit the amount of information that can be displayed at one time. Mobile browsers display content using two main transformation methods: direct migration and columnar (linear). In direct migration, no transformations are made to the original web page; in the columnar (or linear) approach, page areas are presented one after another in a single column, so the presentation of the website's information is changed to a long linear list that easily fits within the small-screen constraint of the mobile device. The major advantage of this approach is that horizontal scrolling is not required. Many HTML web pages are not supported by Internet-enabled mobile handheld devices because the pages may not be properly and speedily displayed on the devices' micro-browsers due to low memory capacity, small screen size, limited computing power, narrow network bandwidth, and other resource constraints. Web usage mining, a branch of web mining, can be helpful in summarizing web pages for these devices: it supports data gathering, navigation pattern discovery, and pattern analysis, and hence helps improve the readability and download speed of mobile web pages.
Full Paper

IJCST/34/3/A-1204
   110 Stress Recognition in Physiological Signals from Drivers by Time & Frequency Domain

Dhananjay Kumar, Lakshmi Sahitya U, Kumar Gaurav Shankar

Abstract

This work deals with the fatigue analysis of automobile drivers by acquiring their physiological signals during real-world driving tasks to determine the driver's relative stress level. The ECG signals were evaluated using two methods: time domain analysis and frequency domain analysis. In the time domain analysis, the Pan-Tompkins method is used to detect the QRS complex, determine its width, and determine the driver's heart rate. The frequency domain analysis determines the correlation of the QRS complex with a marker signal; with the help of the correlation coefficients we can determine the stress level. The marker signal was produced by evaluators observing the drivers at various levels. These findings indicate that physiological signals can provide a metric of driver stress in future cars capable of physiological monitoring.
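Once R-peaks are located, heart rate follows directly from the R-R intervals. The detector below is a deliberately naive stand-in for Pan-Tompkins, which adds band-pass filtering, differentiation, squaring, and moving-window integration before thresholding; the threshold and sampling rate are illustrative.

```python
def detect_r_peaks(sig, threshold):
    """Naive R-peak detector: local maxima above a fixed threshold."""
    peaks = []
    for i in range(1, len(sig) - 1):
        if sig[i] > threshold and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]:
            peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from R-R intervals: peak indices (samples) and
    sampling frequency fs (Hz) -> beats per minute."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]  # seconds per beat
    return 60.0 / (sum(rr) / len(rr))
```

A rising heart rate and shrinking R-R variability over the drive are the kind of time-domain stress indicators the analysis looks for.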
Full Paper

IJCST/34/3/A-1205
   111 An Effective Review on File Download Time in Peer-to-Peer Networks

Sharada Kethireddy, M. N. S. Lakshmi

Abstract

Peer-to-peer (P2P) networks are widely used for file sharing and file transfer, accounting for around 70% of Internet bandwidth. The average download time of a file is an important performance metric in P2P networks. Service capacity and network congestion are the two major factors that play a significant role in the performance analysis of file downloading. We point out that the common approach of analyzing the average download time based on average service capacity is fundamentally flawed: both spatial heterogeneity and temporal correlations increase the average download time in P2P networks. We review a simple, distributed algorithm that effectively removes these negative factors, thus minimizing the average file download time for each user in the P2P network. With this approach, the average file download time and file transfer rate are effectively managed in P2P networks.
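Why averaging service capacities is flawed follows from Jensen's inequality (E[1/C] ≥ 1/E[C]): dividing the file size by the average capacity understates the true average download time whenever capacities are heterogeneous. A numeric check, with file size and capacities chosen arbitrarily:

```python
def avg_download_time_per_peer(file_size, capacities):
    """True average download time: the average of the per-source times."""
    return sum(file_size / c for c in capacities) / len(capacities)

def naive_time_from_avg_capacity(file_size, capacities):
    """The flawed estimate: file size divided by the average capacity."""
    return file_size / (sum(capacities) / len(capacities))
```

For a 100 MB file served at 1 MB/s or 10 MB/s with equal probability, the true average is 55 s while the naive estimate is only about 18 s, which is the spatial-heterogeneity effect the abstract refers to.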
Full Paper

IJCST/34/3/A-1206
   112 Full Bridge DC-DC Step-Down Converter With ZVZCS PWM Control Scheme

J. Sivavara Prasad, Y. P. Obulesh, Ch. Saibabu, S. Ramalinga Reddy

Abstract

The Full-Bridge (FB) DC-DC converter is an attractive topology for high-power-level applications. A new Zero-Voltage and Zero-Current-Switching (ZVZCS) Full-Bridge (FB) Pulse-Width-Modulation (PWM) converter is proposed to improve its performance. In this converter, all switches are turned ON or OFF only when either the voltage across the switch or the current through the switch is zero. By adding a secondary active clamp and controlling the clamp switch appropriately, ZVS (for the leading-leg switches) and ZCS (for the lagging-leg switches) are achieved without adding any lossy components or a saturable reactor. Many advantages, including a simple circuit topology, high efficiency, and low cost, make the new converter attractive for high-voltage, high-power (>10 kW) applications. The principle of operation is explained and analyzed, and the features and design considerations of the new converter are illustrated. Finally, the full-bridge DC-DC step-down converter is simulated with the help of MATLAB Simulink.
Full Paper

IJCST/34/3/A-1207
   113 Towards An Effective Fuzzy Keyword Search and Ranking Framework for File Information Management System

R Mabubasha, B. Suresh, Sudeep Nair, GVK Kishore

Abstract

Due to the rapid increase of data in File Information Management Systems (FIMS), we have to use scalable, IR-style data retrieval for efficient results. Although many Data Retrieval Systems (DRS) have been proposed previously by different authors, all of them have inherent limitations in retrieving efficient results: some DRS depend only on the textual content of files, the folder structure, keyword co-occurrence, and so on. In this paper we introduce a multi-dimensional framework which combines various file properties to determine the relevance of a file for a given fuzzy keyword query, with results ranked by a TF*IDF ranking strategy. Our framework first considers the various dimensions of a file, such as content, folder structure, and metadata, and evaluates a relevance score for each dimension individually. The framework then integrates all dimension scores into a meaningful unified score, which is evaluated by the TF*IDF ranking strategy; this ensures the top-ranked results obtained for a fuzzy keyword search are more relevant, and the system more scalable, than previous data retrieval systems.
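The TF*IDF ranking step can be sketched as follows, using the standard formulation tf = raw term count and idf = log(N/df); the paper's exact weighting may differ.

```python
import math
from collections import Counter

def tfidf_scores(query_terms, docs):
    """Score each document (a list of tokens) against the query terms
    with TF*IDF: rare terms that occur often in a document score high."""
    n = len(docs)
    tfs = [Counter(d) for d in docs]            # term frequencies per document
    scores = []
    for tf in tfs:
        s = 0.0
        for t in query_terms:
            df = sum(1 for other in tfs if t in other)  # document frequency
            if df:
                s += tf[t] * math.log(n / df)
        scores.append(s)
    return scores
```

Sorting documents by these scores produces the ranked list; in the framework above the per-dimension scores (content, folder structure, metadata) would each feed into such a ranking before being unified.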
Full Paper

IJCST/34/3/A-1208
   114 Web Comprehension by UML Stereotypes

Chothmal Choudhary

Abstract

Web applications use components developed in various technologies through an abstraction space richer than that of the object-oriented paradigm. The architecture of web applications can be represented by showing specific web components, their compositions, navigations, and inter-component relationships. In this research, we propose a component-centric, UML-based approach for modeling the architecture of web applications. Our approach is based on a classification of components and inter-component relationships that typically occur in web applications. We use UML extension mechanisms to represent specific web components.
Full Paper

IJCST/34/3/A-1209
   115 An Approach to: Identify and Retrieve Subsequence Matching Frames from Videos

R Shiva Shankar, D Ravi Babu, J Rajanikanth, K V S Murthy

Abstract

Retrieving duplicate videos is becoming more and more important with the exponential growth of online video. Although various approaches have been proposed for managing large video databases and for effective video indexing and retrieval, they focus mainly on retrieval accuracy and are infeasible for querying Web-scale video databases in real time. This paper proposes a novel method that addresses the efficiency and scalability issues of finding similar frames within videos. As opposed to inserting a distinct pattern into the video stream, video copy detection techniques match content-based signatures to detect copies of a video; existing content-based copy detection schemes have typically relied on image matching. To match video sequences effectively with a low computational load, we use key frames extracted by the cumulative directed divergence and compare the sets of key frames using the proposed method and the g-l technique. Motion, intensity, and color-based signatures are compared in the context of copy detection, and results are reported on detecting copies of movie clips. To demonstrate the effectiveness and efficiency of the proposed method, we evaluate its performance on an open video data set containing about 10K videos and compare it with four existing methods in terms of precision and time complexity.
Full Paper

IJCST/34/3/A-1210
   116 Efficiently Detecting the Top Nearest Neighbor Objects in Spatial Database

Kumar Vasantha, Dhana Krishna. K

Abstract

Spatial database systems manage large collections of geographic entities, which contain both spatial and non-spatial information. An attractive type of preference query selects the best spatial location with respect to the quality of facilities in its spatial neighborhood. User preference queries are very important in spatial databases: with their help, one can find the best location among the points stored in the database. In many situations users evaluate the quality of a location by its distance to the nearest neighbor among a special set of points, yet little attention has been paid to evaluating a location by its distances to nearest neighbors in spatial user preference queries. This problem has applications in many domains, such as service recommendation systems and investment planning. Related work in this field is based on top-k queries, whose drawback is that the user must set weights for the attributes and a function for aggregating them, which is difficult in most cases. In this paper a new type of user preference query, the spatial nearest neighbor skyline query, is introduced, in which the user supplies several sets of points as query parameters. For each point in the database, the attributes are its distances to the nearest neighbors from each set of query points. By treating this query as a subset of dynamic skyline queries, the N2S2 algorithm is provided for computing it. This algorithm performs well compared with the general branch-and-bound algorithm for skyline queries.
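The query semantics above (attributes = distances to the nearest neighbor in each query set, then a skyline over those attributes) can be sketched with a naive quadratic dominance check; this is only an illustration of the problem definition, not the paper's N2S2 algorithm:

```python
from math import dist

def nn_distance(p, points):
    # distance from p to its nearest neighbor in one query point set
    return min(dist(p, q) for q in points)

def dominates(a, b):
    # a dominates b if it is no worse in every attribute and better in one
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def nn_skyline(db_points, query_sets):
    # attribute vector of each point: distance to the NN of each query set
    attrs = {p: tuple(nn_distance(p, s) for s in query_sets)
             for p in db_points}
    # keep every point not dominated by any other (O(n^2) baseline)
    return [p for p in db_points
            if not any(dominates(attrs[q], attrs[p])
                       for q in db_points if q != p)]

# two query sets (e.g. schools, hospitals); (5, 5) is far from both
print(nn_skyline([(0, 0), (1, 1), (5, 5)], [[(0, 1)], [(1, 0)]]))
```

The paper's contribution is computing the same result faster than this baseline and the general branch-and-bound skyline algorithm.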
Full Paper

IJCST/34/3/A-1211
   117 End-to-End Congestion Control for TCP

K. Pavan Kumar, Y. Padma

Abstract

Reliable transport protocols such as TCP are tuned to perform well in traditional networks where packet losses occur mostly because of congestion. However, networks with wireless and other lossy links also suffer from significant losses due to bit errors and handoffs. TCP responds to all losses by invoking congestion control and avoidance algorithms, resulting in degraded end-to-end performance in wireless and lossy systems. The proposed solutions focus on a variety of problems, starting with the basic problem of eliminating the phenomenon of congestion collapse, and also include the problems of effectively using the available network resources in different types of environments (wired, wireless, high-speed, long-delay, etc.). In a shared, highly distributed, and heterogeneous environment such as the Internet, effective network use depends not only on how well a single TCP based application can utilize the network capacity, but also on how well it cooperates with other applications transmitting data through the same network. Our survey shows that over the last 20 years many host-to-host techniques have been developed that address several problems with different levels of reliability and precision. There have been enhancements allowing senders to detect fast packet losses and route changes. Other techniques have the ability to estimate the loss rate, the bottleneck buffer size, and level of congestion.
Full Paper

IJCST/34/3/A-1212
   118 Professionally Providing Security to Cloud using Nonlinear Programming

Chakrapani Avala, Raja Sekhar. M

Abstract

Cloud computing economically enables customers with limited computational resources to outsource large-scale computations to the cloud. However, protecting customers’ confidential data involved in these computations then becomes a major security concern. In this paper, we present a secure outsourcing mechanism for solving large-scale nonlinear programming (NLP) problems in the cloud. It provides a practical mechanism design that fulfils input/output privacy, cheating resilience, and efficiency. In the proposed approach, practical efficiency is achieved by an explicit decomposition of NLP into public NLP solvers running on the cloud and private NLP parameters owned by the customer. Compared to the general circuit representation, the resulting flexibility allows exploring an appropriate security/efficiency trade-off via a higher-level abstraction of NLP computations. By framing the private data possessed by the customer as a combination of matrices and vectors, a set of effective privacy-preserving transformation techniques can be constructed, allowing customers to transform the original NLP problem into a random-looking one while protecting sensitive input and output information. To verify the computational result, the fundamental duality theorem of NLP is explored to derive the necessary and sufficient conditions that a correct result must satisfy.
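The masking idea (hide the private problem behind random matrices, let the cloud solve the masked problem, then unmask and verify) can be illustrated on a linear system; this is a simplified linear analogue chosen for brevity, not the paper's NLP transformation:

```python
import numpy as np

rng = np.random.default_rng(0)

def outsource_solve(A, b):
    """Privacy-masking sketch: the customer hides A and b behind random
    invertible matrices P, Q before sending the system to the cloud,
    then unmasks the returned answer and verifies it locally."""
    n = A.shape[0]
    # diagonal boost keeps the random masks well-conditioned (invertible)
    P = rng.random((n, n)) + n * np.eye(n)
    Q = rng.random((n, n)) + n * np.eye(n)
    A_masked, b_masked = P @ A @ Q, P @ b     # only these leave the customer
    y = np.linalg.solve(A_masked, b_masked)   # work done by the cloud
    x = Q @ y                                 # customer recovers the result
    # cheap verification: the unmasked answer must satisfy the original system
    assert np.allclose(A @ x, b)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
print(outsource_solve(A, b))  # solution of A x = b
```

The final check plays the role of the paper's duality-based result verification: a cheating cloud's answer fails the residual test.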
Full Paper

IJCST/34/3/A-1213
   119 Comparison of Standard K-Means and Modified K-Means Algorithms- Initial Centers Derived from Data Partitioning

Pratti Bhagya Rekha, Bandi Hari Krishna, P N V S Chaitanya, Nataraj Guadapaty

Abstract

One of the most widely used clustering techniques is the k-means algorithm. It has been applied in many fields of science and technology. Solutions obtained with this technique depend on the initialization of the cluster centers, which are assigned randomly; a major problem of the k-means algorithm is that this random initialization may produce empty or small clusters. Our paper presents modified versions of the k-means algorithm that efficiently eliminate this empty- or small-cluster problem. We propose algorithms to compute initial cluster centers for k-means clustering. The algorithms partition the whole space into different partitions and calculate the frequency of data points in each partition. The partition with the maximum frequency of data points is most likely to contain a cluster centroid. The algorithms keep the centers of any two partitions as far apart as possible. The data points calculated for the K partitions become the initial cluster centers for k-means. We propose different methods to initialize cluster centers, such as the equidistance method, the equal data partitioning method, and the k-means++ method of initializing centers using weighted probability. Preliminary experiments show that our proposed algorithms improve both the speed and the accuracy of k-means. The algorithms are compared by considering time complexities using bar charts and line charts.
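The partitioning idea (grid the data space, count points per cell, seed k-means from the densest cells) can be sketched as follows; the grid granularity and cell-centroid choice are illustrative assumptions, and this sketch omits the paper's far-apart constraint between chosen partitions:

```python
from collections import Counter
import numpy as np

def partition_centers(X, k):
    """Pick k initial centers for k-means from the k most populated
    cells of a grid partition of the data space."""
    mins, maxs = X.min(axis=0), X.max(axis=0)
    bins = 2 * k  # cells per dimension (illustrative choice)
    # assign each point to a grid cell
    cell = np.floor((X - mins) / (maxs - mins + 1e-12) * bins).astype(int)
    keys = [tuple(c) for c in cell]
    freq = Counter(keys)  # frequency of data points per partition
    centers = []
    for key, _ in freq.most_common():  # densest cells first
        if len(centers) == k:
            break
        pts = X[[i for i, kk in enumerate(keys) if kk == key]]
        centers.append(pts.mean(axis=0))  # cell centroid as initial center
    return np.array(centers)
```

Seeding from dense, well-separated cells avoids the empty-cluster problem that random initialization can cause.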
Full Paper

IJCST/34/3/A-1214
   120 Web Mining: Methodologies, Algorithms and Applications

SK. Ahmad Mohiddin, D. Babu Rajendra Prasad

Abstract

The World Wide Web is a popular and interactive medium to disseminate information today. It is a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia, and navigate between them via hyperlinks. With the recent explosive growth of the amount of content on the Internet, it has become increasingly difficult for users to find and utilize information and for content providers to classify and catalog documents on the World Wide Web. Traditional web search engines often return hundreds or thousands of results for a search, which is time consuming for users to browse. On-line libraries, search engines, and other large document repositories (e.g. customer support databases, product specification databases, press release archives, news story archives, etc.) are growing so rapidly that it is difficult and costly to categorize every document manually. To deal with these problems web mining is used. Web mining is the use of data mining techniques to automatically discover and extract information from the web documents and services. This paper presents an overview of web mining, its methodologies, algorithms and applications.
Full Paper

IJCST/34/3/A-1215
   121 Minimization of Churn and Load of Continuous Queries in P2P Networks

Bandaru Durga Sri, K. V. Krishna Rao, Malladi Lakshmi Narayana

Abstract

This paper presents CoQUOS, a scalable and lightweight middleware to support continuous queries in unstructured P2P networks. A key strength of this technique is that it can be implemented on any unstructured overlay network, preserving the simplicity and flexibility of the overlay. It is a completely decentralized scheme for registering a query at different regions of the P2P network and includes two novel components, namely a cluster-resilient random walk and a dynamic probability-based query registration technique, which address the loosely coupled and highly dynamic nature of the underlying P2P network. This paper focuses on the issues that are of particular importance to the performance of the CoQUOS system, namely churn of the P2P overlay and load distribution among peers.
Full Paper

IJCST/34/3/A-1216
   122 Help Senior Citizen: Web Service Management System

B.V.A Swamy, P. Satya Narayan

Abstract

Service-oriented computing is a new paradigm that holds real promise to move the field of computing from a data-centric to a service-centric view. A web service is defined as functionality that can be programmatically accessed via the web; web services provide an efficient vehicle for users to access the functionalities available on the web. One key problem in the context of web services is the lack of any systematic methodology for managing their entire life cycle, including automatic service composition, service query optimization, service privacy preservation, service trust, and change management. This paper describes a web service management system called “Help Senior Citizen”, a service-oriented government system that aims at providing services to senior citizens and uses web services as the framework for implementing and deploying digital government. This paper also describes extensive experiments performed to assess the performance of the web service management system and its key components.
Full Paper

IJCST/34/3/A-1217
   123 Awareness of Coordination Problems in Collaborative Software Development Environments

Kiran Kumar Siripurapu, Chakrapani Avala

Abstract

Modern software engineering inevitably involves teams of developers collaboratively working on software artifacts. Challenges to success include both the size (millions of lines of code, thousands of classes) and complexity of the software under development. However, increasingly, individual developers may be physically separated perhaps to the extent that they are in different time zones. Similarly, the software artifacts may also be arbitrarily distributed/ replicated at the developer’s locations. Supporting effective collaboration between software engineers is itself a difficult problem: supporting effective collaboration between physically separated software engineers remains an open problem. In this paper, instead of taking for granted the social actors involved in the coordination of work through awareness, we unpack how software developers in their daily work identify this set of actors. This is necessary to properly understand how collaboration is achieved in software development efforts and to allow computational support for awareness. The work reported in this paper also provides an understanding of which and how different aspects (e.g., the organizational setting) facilitate the identification of these actors. By addressing these issues, we can design better collaboration tools that facilitate the coordination of work, especially software development work. As we mentioned before, this has not been studied in previous studies of collaborative work, neither in software development nor in other domains.
Full Paper

IJCST/34/3/A-1218
   124 Simulation Comparison of AODV and DSDV using TCP and UDP Traffic Patterns

Gurjeevan Singh, Meenakshi Sharma, Karamjeet Singh, Hitesh Uppal

Abstract

This research paper presents a comprehensive simulation study of the Ad hoc On-Demand Distance Vector (AODV) and Destination Sequenced Distance Vector (DSDV) routing protocols under different traffic patterns (TCP and UDP) for Mobile Ad hoc Networks. The performance of the above-mentioned routing protocols is analysed using metrics such as packet loss, end-to-end delay, and bandwidth in the Network Simulator (NS-2).
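Post-processing a simulation into the metrics named above can be sketched as follows; the record format is an illustrative simplification, not NS-2's actual trace-file syntax:

```python
def summarize(trace):
    """Compute packet loss (%) and average end-to-end delay from a list
    of (pkt_id, send_time, recv_time) records; recv_time is None when
    the packet was lost in transit."""
    sent = len(trace)
    delivered = [(s, r) for _, s, r in trace if r is not None]
    loss_pct = 100.0 * (sent - len(delivered)) / sent
    # end-to-end delay of a delivered packet is receive minus send time
    avg_delay = sum(r - s for s, r in delivered) / len(delivered)
    return loss_pct, avg_delay

trace = [(1, 0.00, 0.05), (2, 0.10, 0.18),
         (3, 0.20, None), (4, 0.30, 0.37)]
loss, delay = summarize(trace)
print(loss, delay)  # one lost packet out of four
```

In a real study these numbers would be extracted per protocol and traffic type from the NS-2 trace files and plotted against each other.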
Full Paper

IJCST/34/3/A-1219
   125 Virtual Path Topologies Based On Travel Production Scheme

Pemmasani Venkata Vasavi

Abstract

Given the simple but rigid path and forwarding functionalities in IP-based environments, efficient resource management and control solutions against dynamic travel conditions have yet to be obtained. In this article, we introduce AMPLE, an efficient travel production and management scheme that performs adaptive travel control using multiple virtualized path topologies. The proposed scheme consists of two complementary components: offline link-weight optimization, which takes the physical network topology as input and tries to produce maximum path diversity across multiple virtual path topologies for long-term operation through the optimized setting of link weights; and adaptive travel control, which, based on these diverse paths, performs intelligent travel splitting across individual path topologies in reaction to monitored network dynamics at short timescales. We evaluate the scheme with real network topologies and travel traces. Handling travel dynamics in order to avoid network congestion and subsequent service disruptions is one of the key tasks performed by contemporary network management schemes.
Full Paper

IJCST/34/3/A-1220
   126 Energy: The Future of Global Growth

Rajendra P Swargam, Venkat R Settipally, Nagaraju Ullingala, Saurabh Jain

Abstract

Energy is the lifeblood of the global economy, a crucial input to nearly all of the goods and services of the modern world. This paper focuses on creating new insights and a platform for stakeholders and industry to act upon some of the most important energy issues. Globally and nationally, the architecture of energy systems is undergoing significant change, and governments, industry, and other stakeholders are seeking new solutions to ensure that energy systems meet the requirements of economic growth, sustainability, and energy security. This paper emphasizes areas of growth in energy production and what follows in terms of jobs and value creation.
Full Paper

IJCST/34/3/A-1221
   127 Substantiation Matching Over Inquiry Results from Various Network Databases

A. Srinivasa Rao, Ch. RajaJacob

Abstract

In a network database scenario, most state-of-the-art evidence matching methods, such as SVM, OSVM, PEBL, and Christen, are efficient in IR systems, but they require huge training data sets for pre-learning. To address this problem, Unsupervised Duplicate Detection (UDD), an inquiry-dependent evidence matching method, was developed earlier. For a given inquiry, it can effectively identify duplicates in the inquiry results of various network databases. Non-duplicate evidences from the same source can be used as training examples: starting from a non-duplicate set, UDD uses two cooperating classifiers, a weighted component similarity summing classifier and an SVM classifier, that iteratively identify duplicates in the inquiry results from various network databases. For string similarity calculation UDD can use any similarity measure; we propose to use a faster string similarity calculation (SimString using SWIG) to optimize the performance of UDD. Evidence matching is an essential step in duplicate detection, as it identifies evidences representing the same real-world entity. Supervised evidence matching methods require users to provide training data and therefore cannot be applied to network databases where inquiry results are generated on the fly. To overcome this problem, a new evidence matching method named Unsupervised Duplicate Elimination (UDE) is proposed for identifying and eliminating duplicates among evidences in dynamic inquiry results. The idea of this paper is to adjust the weights of evidence fields when calculating similarities among evidences. Two classifiers, a weighted component similarity summing classifier and a support vector machine classifier, are iteratively employed in UDE: the first classifier utilizes the weight set to match evidences from different data sources; with the matched evidences as the positive set and non-duplicate evidences as the negative set, the second classifier identifies new duplicates.
Then, a new methodology to automatically interpret and cluster knowledge documents using an ontology schema is presented. Moreover, a fuzzy logic control approach is used to match suitable document cluster(s) for given patents based on their derived ontological semantic networks. Thus, this paper takes advantage of the similarity among evidences from network databases and solves the online duplicate detection problem.
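The weighted component similarity summing classifier described above can be sketched as follows; the Jaccard token measure stands in for whatever string-similarity function is used (e.g. SimString), and the field weights are illustrative:

```python
def field_similarity(a, b):
    # simple Jaccard token similarity for one evidence field
    # (a stand-in for any string-similarity measure)
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 1.0

def record_similarity(rec1, rec2, weights):
    # weighted component similarity summing: weight each field's
    # similarity and sum; weights are assumed to be normalized to 1
    return sum(w * field_similarity(f1, f2)
               for w, f1, f2 in zip(weights, rec1, rec2))

r1 = ("Data Mining Concepts", "J Han")
r2 = ("Data Mining Concepts", "Jiawei Han")
# identical title, partially matching author -> high score, likely duplicates
print(record_similarity(r1, r2, [0.7, 0.3]))
```

In the iterative scheme, pairs scoring above a threshold feed the SVM classifier as positive examples, and the field weights are then re-adjusted.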
Full Paper

IJCST/34/3/A-1222
   128 Reliable Computing Under Resources Constraints Policy

S. Anitha Reddy, V. Uma Maheswari

Abstract

Hardware-based trusted computing platforms are intended to overcome many of the problems of trust that are prominent in computing systems. In this paper, a result of the Software Engineering Institute’s Independent Research and Development Project “Trusted Computing in Extreme Adversarial Environments: Using Trusted Hardware as a Foundation for Cyber Security,” we discuss the capabilities and limitations of the Trusted Platform Module (TPM). We describe credential storage, device identity, chains of trust, and other techniques for extending hardware-based trust to higher levels of software-based infrastructure. We then examine the character of trust and identify strategies for increasing trust. We show why acceptance of TPM-based trust has been limited to date and suggest that broader acceptance will require more focus on traditional trust issues and on end-to-end services.
Full Paper

IJCST/34/3/A-1223
   129 Improved BN Model for Regular and Interactive Image Segmentation

Y. Siva Prasad, K. V. Srinivasa Rao

Abstract

We propose a new enhanced Bayesian network (BN) model for both automatic and interactive image segmentation. A multilayer BN is constructed from an over-segmentation to model the statistical dependencies among superpixel regions, edge segments, vertices, and their measurements. The BN also incorporates various local constraints to further restrain the relationships among these image entities. The proposed system constructs structures for the background and foreground images (i.e., a two-phase image constructor) and clusters the structured grouped pixels by applying the BN model for easy image segmentation. Given the BN model and various image measurements, belief propagation is performed to update the probability of each node. The image segmentation is generated by most-probable-explanation inference of the true states of both region and edge nodes from the updated BN. Besides automatic image segmentation, the proposed model can also be used for interactive segmentation. While existing interactive segmentation (IS) approaches often passively depend on the user to provide exact intervention, we propose a new active input selection approach that offers suggestions for the user's intervention; such intervention can be conveniently incorporated into the BN model to perform active IS. We evaluate the proposed model on both the Weizmann dataset and VOC2006 cow images. The results demonstrate that the BN model can be used for automatic segmentation and, more importantly, for active IS. The experiments also show that IS with active input selection can improve both the overall segmentation accuracy and efficiency over IS with passive intervention.
Full Paper

IJCST/34/3/A-1224