
 

INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST) - VOL IV ISSUE III, VER. 4, JULY TO SEPT, 2013


  S.No. Research Topic Paper ID
    112 Region Based Method Used in Image Segmentation and Object Recognition
Himmat Singh Rana, Pawan Kumar Mishra

Abstract

Digital image processing is one of the main technologies used in object detection and image segmentation. It has been applied in several areas, especially where tools for feature extraction and pattern recognition of the studied images are needed. In an initial stage, segmentation is used to separate the image into parts that represent an object of interest for a specific study. Several methods attempt to perform this task, but it is difficult to find one that adapts easily to different types of images, which are often complex or highly specific. To address this problem, this work presents an adaptable segmentation method that can be applied to different types of images, providing better segmentation. The proposed method is based on a model of automatic multilevel thresholding and employs techniques of group histogram quantization, analysis of the histogram slope percentage, and calculation of maximum entropy to define the thresholds. This paper presents a general approach to image segmentation and object recognition that can adapt the segmentation algorithm's parameters to changing environmental conditions. Segmentation parameters are represented by a team of generalized stochastic learning automata and learned using connectionist reinforcement learning techniques. The edge-border coincidence measure is first used as reinforcement for segmentation evaluation, to reduce the computational expense associated with model matching during the early stage of adaptation. This measure alone, however, cannot reliably predict the outcome of object recognition. It is therefore used in conjunction with model matching, where the matching confidence serves as a reinforcement signal to provide optimal segmentation evaluation in a closed-loop object recognition system.
The adaptation alternates between global and local segmentation processes in order to achieve optimal recognition performance. Results are presented for both indoor and outdoor color images, where the performance improvement over time is shown for both image segmentation and object recognition.
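As an aside, the maximum-entropy criterion the abstract mentions can be illustrated with a minimal single-threshold sketch in the style of Kapur's method (the 8-bin histogram is invented for the demo; the paper's multilevel, quantized variant is more involved):

```python
import math

def kapur_threshold(hist):
    """Pick the threshold t that maximizes the sum of the entropies of the
    two classes (background: bins <= t, foreground: bins > t)."""
    total = sum(hist)
    p = [h / total for h in hist]          # normalized histogram
    best_t, best_h = 0, float("-inf")
    for t in range(len(hist) - 1):
        w0 = sum(p[: t + 1])               # class probabilities
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[: t + 1] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t + 1:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# A bimodal histogram: dark pixels in bins 0-2, bright pixels in bins 5-7.
hist = [30, 40, 30, 1, 1, 25, 35, 25]
t = kapur_threshold(hist)
```

On this bimodal histogram the selected threshold falls in the valley between the two modes, which is what the entropy criterion is designed to find.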
Full Paper

IJCST/43/4/C-1704
    113 A Principle Factor Analysis Based on Euclidean Distance With Normalization Techniques for Illumination Invariant Face Recognition
C.V.Arulkumar, S.Sampath Kumar, S.Vignesh

Abstract

Changes in lighting conditions impact the appearance of faces to a large extent. The work presented here compares the performance of illumination compensation methods for face images, namely DCT normalization, Wavelet Denoising, Gradient Faces, Local Contrast Enhancement, and Weber Faces, under different lighting conditions. The face images are preprocessed and normalized by each method to reduce the effect of illumination. The features of these preprocessed images are then extracted using Principal Factor Analysis (PFA), and recognition is based on Euclidean distance. In this paper, the advantages and drawbacks of each method are analyzed. The recognition rate and computational time of these methods are compared using the Extended Yale B database.
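The final matching step, comparing extracted feature vectors by Euclidean distance, can be sketched as follows (the three-dimensional vectors and subject names are invented placeholders for PFA projections):

```python
import math

def euclidean(a, b):
    # straight-line distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(probe, gallery):
    """Return the enrolled identity whose feature vector is closest to the probe."""
    return min(gallery, key=lambda name: euclidean(probe, gallery[name]))

# Toy 3-D vectors standing in for PFA feature projections.
gallery = {"subject_A": [0.9, 0.1, 0.3], "subject_B": [0.2, 0.8, 0.5]}
match = recognize([0.85, 0.15, 0.25], gallery)
```

The probe vector lies close to subject_A's enrolled features, so nearest-neighbor matching returns that identity.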
Full Paper

IJCST/43/4/C-1705
    114 A Novel Approach for Maintaining Integrity and Correctness of Storage in Cloud Computing
Radhika Paladugu, Yanumula Sankara Rao

Abstract

Cloud storage enables users to remotely store their data and enjoy high-quality cloud applications without the burden of local hardware and software management. Though the benefits are clear, such a service also gives up users' physical control of their outsourced data, which raises security concerns about the correctness of the data stored in the cloud. In order to address this new problem and further achieve a secure and dependable cloud storage service, we propose in this paper a flexible distributed storage integrity auditing mechanism using homomorphic tokens and distributed erasure-coded data. The proposed design also allows users to audit the cloud storage with very lightweight communication and computation cost. The auditing result not only ensures a strong cloud storage correctness guarantee, but also simultaneously achieves fast data error localization, i.e., the identification of the misbehaving server. To securely introduce an effective third-party auditor (TPA), the auditing process should bring in no new vulnerabilities toward user data privacy and introduce no additional online burden to the user. In this paper, we propose a secure cloud storage system supporting privacy-preserving public auditing. We further extend our result to enable the TPA to perform audits for multiple users simultaneously and efficiently. Analysis shows the proposed scheme is highly efficient and resilient to data modification attacks and even server-colluding attacks.
Full Paper

IJCST/43/4/C-1706
    115 An Efficient and Secure Architecture Achieving Untraceability of Client in WMN
I. Subhashini, N.Srinu

Abstract

As users of networks become increasingly aware of their privacy needs, anonymity is gaining importance. The reason is that anonymity can hide the actual identity of end users while still allowing them to access network services or web sites, and to do so without being traced. This usage is prevalent in P2P systems and also in payment-based networks such as e-cash. Achieving anonymity and being able to trace misbehaving users are two conflicting requirements. In this paper, we propose a security architecture to ensure unconditional anonymity for honest users and traceability of misbehaving users for network authorities in WMNs. The proposed architecture strives to resolve the conflict between the anonymity and traceability objectives, in addition to guaranteeing fundamental security requirements including authentication, confidentiality, data integrity, and non-repudiation.
Full Paper

IJCST/43/4/C-1707
    116 Robustly Detecting and Eliminating the Conflicts in Firewall Policies
Annapareddi Surendrababu, S.Jalaiah, P.Pedda Sadhu Naik

Abstract

Firewalls are a widely deployed security mechanism for ensuring the security of private networks in most businesses and institutions. The effectiveness of the security protection provided by a firewall mainly depends on the quality of the policy configured in it. However, designing and managing firewall policies is often error-prone due to the complex nature of firewall configurations as well as the lack of systematic analysis mechanisms and tools. This paper presents an innovative anomaly management framework for firewalls, adopting a rule-based segmentation technique to identify policy anomalies and derive effective anomaly resolutions. PolicyVis, also presented in this paper, provides visual views of firewall policies and rules, giving users a powerful means for inspecting firewall policies.
Full Paper

IJCST/43/4/C-1708
    117 Compressed-Sensing-Enabled Video Streaming for Wireless Multimedia Sensor Networks
Sweeta Jallal, A. Krishna Mohan

Abstract

The flow of data around us is growing rapidly, whereas the number of salient features in the data is usually much smaller than the number of coefficients in its representation. To exploit this, the data can be compressed down to those salient coefficients, an approach known as compressed sensing or compressive sampling. In this paper we discuss the design of a networked system for joint compression, rate control, and error correction of video over resource-constrained embedded devices based on the theory of compressed sensing.
Full Paper

IJCST/43/4/C-1709
    118 Energy Minimization for Wireless Sensor Networks Using Opportunistic Routing
Y. Ramesh Kumar, B. Ramesh Babu, G. Chandra Sekhar

Abstract

Wireless sensor networking research has received considerable attention in recent years, as it represents the next phase of networking evolution. Efficient and reliable routing of data from source to destination with minimal power consumption remains the crux of the research problem. Source privacy is one of the looming challenges that threaten successful deployment of these sensor networks, especially when they are used to monitor sensitive objects. In order to enhance source location privacy in WSNs, an opportunistic routing scheme is used, in which each sensor node transmits the packet over a dynamic path to the destination. An energy-minimized opportunistic routing technique is used for efficient utilization of power: energy can be utilized efficiently by reducing the transmission power of each node. Using the energy-minimized opportunistic routing algorithm, packets are sent in a secure manner with minimal power.
Full Paper

IJCST/43/4/C-1710
    119 Scalable and Reliable Protection of Location Privacy in Wireless Sensor Network
R.L.Sudharupa, G.Sunnydeol

Abstract

A Wireless Sensor Network (WSN) is composed of numerous small sensing devices with limited communication range. The sensors collect data from the environment and report them to the sinks. With promising sensing and wireless technologies, sensor networks are expected to be widely deployed in a broad spectrum of civil and military applications. Location information of the sinks, the sensors, and the objects being tracked is very important in sensor networks. Protecting location privacy in sensor networks is crucial, considering the different kinds of attacks that may disrupt their normal function. In this paper, through a Linear Programming (LP) framework, we analyze the lifetime limits of WSNs protecting event unobservability under different proxy assignment methodologies. We show that to maximize the network lifetime, data flow should pass through multiple proxies that are organized as a general directed graph rather than as a tree.
Full Paper

IJCST/43/4/C-1711
    120 Mining Web Using Diffusion Graph With Recommendation
A.Sasi Devi, P.Suresh Babu

Abstract

The extensive generation of content on the Web has become a challenging problem for researchers. This explosive growth of content has increased the demand for recommendation techniques that bring services to users. We use different kinds of recommendation on the Web every day, including music, movie, image, book, and tag recommendation. Irrespective of the data model used, recommendation can be modeled in the form of various types of graphs. In this paper, we first propose a diffusion method which propagates similarities between different nodes and generates recommendations; we then illustrate how to generalize different recommendation problems into our graph diffusion framework. The proposed framework can be utilized in any recommendation task on the Web.
Full Paper

IJCST/43/4/C-1712
    121 Improved Version of Web Crawler for Efficient Domain Specific Search
Ch.Sri Devi, S.Gopi Krishna

Abstract

Domain-specific search solutions focus on one area of knowledge, creating customized search experiences that, because of the domain's limited corpus and clear relationships between concepts, provide extremely relevant results for searchers. Domain-specific search engines are becoming increasingly popular because they offer increased accuracy and extra features not possible with general, Web-wide search engines. Unfortunately, they are also difficult and time-consuming to maintain. We describe new research in reinforcement learning, text classification, and information extraction that enables efficient spidering, populates topic hierarchies, and identifies informative text segments. Using these techniques, we have built a demonstration system: a search engine. This paper proposes the standard Web crawler and an improved version with an advanced algorithm for efficient results in domain-specific search.
Full Paper

IJCST/43/4/C-1713
    122 Eradicating Data Duplication of the Clustering Result Using WTQ
P.Vasanthi, Ch.Swapna Priya, P. Suresh Babu

Abstract

The data generated by conventional categorical data clustering is incomplete because the information provided is also incomplete. This project presents a new link-based approach, which improves categorical clustering by discovering unknown entries through similarity between clusters in an ensemble. A graph partitioning technique is applied to a weighted bipartite graph to obtain the final clustering result. Clustering plays a crucial, foundational role in machine learning, data mining, information retrieval, and pattern recognition. The experimental results on multiple real data sets suggest that the proposed link-based method almost always outperforms both conventional clustering algorithms for categorical data and well-known cluster ensemble techniques. This paper proposes an algorithm called Weighted Triple-Quality (WTQ), which uses the k-means algorithm for basic clustering, introduces a minhash algorithm to avoid data duplication across clusters, and applies Secure Information Retrieval (SIR) to the data from the final cluster ensemble result.
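The minhash idea used here for duplicate detection can be sketched in a few lines: signatures whose positions agree estimate the Jaccard similarity of the underlying item sets (the item sets and the number of hash functions below are invented for the demo):

```python
def minhash_signature(items, seeds):
    """One min-hash per seeded hash function; the fraction of agreeing
    positions between two signatures estimates Jaccard similarity."""
    return [min(hash((seed, x)) for x in items) for seed in seeds]

def estimated_jaccard(sig_a, sig_b):
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

seeds = range(128)
a = {"fever", "cough", "fatigue", "nausea"}
b = {"fever", "cough", "fatigue", "rash"}   # true Jaccard = 3/5
sim = estimated_jaccard(minhash_signature(a, seeds), minhash_signature(b, seeds))
```

With 128 hash functions the estimate concentrates around the true similarity of 0.6; near-duplicate clusters would show estimates close to 1.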
Full Paper

IJCST/43/4/C-1714
    123 Optimizing Data Security in Cloud Computing
D.Yallamanda, M.Krishna Siva Prasad

Abstract

Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise. In contrast to traditional solutions, where IT services are under proper physical, logical, and personnel controls, Cloud Computing moves the application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. The data protection-as-a-service cloud platform architecture dramatically reduces the per-application development effort required to offer data protection while still allowing rapid development and maintenance. In this paper, the Diffie-Hellman key exchange algorithm is used for providing data protection in clouds.
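The Diffie-Hellman exchange the abstract refers to can be sketched as follows (textbook-sized parameters for illustration only; real deployments use large standardized groups such as the RFC 3526 MODP primes):

```python
import secrets

p, g = 23, 5                          # tiny prime modulus and generator (demo only)

a = secrets.randbelow(p - 2) + 1      # client's private exponent
b = secrets.randbelow(p - 2) + 1      # cloud server's private exponent
A = pow(g, a, p)                      # public value sent by the client
B = pow(g, b, p)                      # public value sent by the server
client_key = pow(B, a, p)             # both sides derive the same shared secret
server_key = pow(A, b, p)
```

Because (g^b)^a = (g^a)^b mod p, both parties end up with the same key without ever transmitting it, which is the property the paper relies on for protecting cloud data.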
Full Paper

IJCST/43/4/C-1715
    124 Data Protection & Malicious Insider Detection in Cloud
R.Gopiraju, D.Demudubabu, G.Chinnababu

Abstract

Cloud computing provides huge data services to the user in an easy and comfortable way over the Internet. A major feature of cloud services is that the data shared by users are processed remotely on unknown machines which are not controlled or owned by the user. Data protection in the cloud during rich application sharing has become a challenging task. We introduce a new platform for cloud computing called Data Protection as a Service, which significantly reduces risk and provides data protection to the remote system. We also propose a different approach for securing data in the cloud using offensive decoy technology. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data.
Full Paper

IJCST/43/4/C-1716
    125 Identifying Top K Results Over XML Data
D.Roja Rani, G.Padmaja

Abstract

In this paper, a new technique named fuzzy type-ahead search in XML data is presented; it searches XML data on the fly as the user types in query keywords, allowing users to explore data as they type, even in the presence of minor errors in their keywords. The proposed method has the following advantages: (1) Search as you type: it extends autocomplete by supporting queries with multiple keywords in XML data. (2) Fuzzy: it can find high-quality answers whose keywords match the query keywords approximately. (3) Efficient: our effective index structures and searching algorithms achieve a very high interactive speed. We propose effective index structures and top-k algorithms to achieve this high interactive speed, and our method achieves high search efficiency and result quality. To reduce the index size, we incorporate a scheme, namely schema-aware Dewey codes, into our structure-aware indices.
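The fuzzy prefix matching behind type-ahead search can be sketched with edit distance: a word matches a typed query if some prefix of the word is within k edits of it (the keyword list and error bound below are invented for the demo; the paper's indexed algorithms avoid this brute-force scan):

```python
def edit_distance(s, t):
    """Levenshtein distance via a rolling one-row dynamic program."""
    dp = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        prev, dp[0] = dp[0], i
        for j, ct in enumerate(t, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (cs != ct))
    return dp[-1]

def fuzzy_prefix_match(query, word, k=1):
    """True if some prefix of word is within k edits of the typed query."""
    return any(edit_distance(query, word[:i]) <= k for i in range(len(word) + 1))

keywords = ["music", "movie", "mouse"]
hits = [w for w in keywords if fuzzy_prefix_match("musc", w)]
```

The mistyped query "musc" still retrieves "music" (its prefix "musi" is one edit away), while unrelated keywords are filtered out.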
Full Paper

IJCST/43/4/C-1717
    126 SPRT: Automatic Detection of Compromised Machines in Campus Network
D.Sameera, G.Sanjiv Rao

Abstract

Machines controlled by crackers are among the main security threats on the Web: they launch various attacks to spread malware and mount DDoS attacks. Systems taken over by crackers are called compromised systems and are involved in spamming activities, commonly known as spam zombies. In this paper, we discuss a system called SPOT which is used for spam zombie detection based on the outgoing messages of a network. SPOT is designed around the Sequential Probability Ratio Test (SPRT), a statistical tool with bounded false positive and false negative error rates. In addition, we compare the performance of SPOT with two other spam zombie detection algorithms, based on the number and percentage of spam messages forwarded by internal machines respectively, and show that SPOT outperforms these two detection algorithms.
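The SPRT at the heart of SPOT can be sketched as follows (the spam probabilities and error-rate targets are illustrative choices, not the paper's tuned values): each outgoing message updates a log-likelihood ratio until one of two decision boundaries is crossed.

```python
import math

def sprt(observations, theta0=0.2, theta1=0.8, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on a stream of 0/1 spam indicators.
    H0: machine is clean (spam prob. theta0); H1: compromised (theta1)."""
    low = math.log(beta / (1 - alpha))         # accept-H0 boundary
    high = math.log((1 - beta) / alpha)        # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(observations, 1):
        llr += math.log(theta1 / theta0) if x else math.log((1 - theta1) / (1 - theta0))
        if llr >= high:
            return "compromised", n
        if llr <= low:
            return "clean", n
    return "undecided", len(observations)

verdict, n = sprt([1, 1, 1, 1, 1, 1])   # a run of outgoing spam messages
```

With these parameters a run of spam messages crosses the upper boundary after only four observations, which is the appeal of the sequential test: decisions use as few messages as the evidence allows.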
Full Paper

IJCST/43/4/C-1718
    127 Multiple Instance Learning for Auction Fraud Detection
G.Subrahmanyam, A.V.D.N.Murthy, P.Suresh Babu

Abstract

With the increasing reach of the Internet, online shopping and online auctions have gained great popularity. While people enjoy these transactions, criminals also participate actively to make illegal profit. There are many approaches to detecting fraud and protecting online transactions, but success remains elusive; machine-learned models are typically used with the help of human-tuned rule-based systems. This paper proposes an online model which takes feature selection and coefficient bounds from human knowledge as key inputs. Through rigorous experiments we conclude that this model can meet users' expectations and help protect them from online fraud.
Full Paper

IJCST/43/4/C-1719
    128 Consumer Emotion and Behavioral Tracking Using Fuzzy Logic
L.Papu Naidu, A.V.D.N.Murthy, P.Suresh Babu

Abstract

Analyzing consumers' buying behavior together with their emotions is a recent and challenging area of research. Every consumer wants a survey of a product before buying it. We propose a semantic web usage mining approach for discovering periodic web access patterns from web usage logs, which provides information on consumer emotions and behaviors through self-reporting and behavioral tracking. We use fuzzy logic to represent real-life temporal concepts and ontological domain concepts for the requested URLs of periodic-pattern-based web access activities. These patterns are treated as personal behavioral and emotional data which can be studied further. Finally, through rigorous experiments we establish the effectiveness of the method and show that emotional influence contributes positively to adaptation in personalized recommendation.
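The fuzzy representation of a temporal concept such as "evening" can be sketched with a trapezoidal membership function (the hour boundaries are invented for the demo; the paper's concepts come from its ontology):

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def evening(hour):
    # degree to which an access time (hour of day) counts as "evening"
    return trapezoid(hour, 16, 18, 21, 23)
```

An access at 19:00 is fully "evening" (membership 1.0), at 17:00 only partially (0.5), and at noon not at all, which is how crisp timestamps in the logs become graded temporal concepts.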
Full Paper

IJCST/43/4/C-1720
    129 Scalable and Reliable Faith Aware Path in Wireless Sensor Network
Ch.Gowthami, A.Lakshman Rao

Abstract

Wireless Sensor Networks (WSNs) are an emerging technology with great potential to be employed in critical situations like battlefields and in commercial applications such as building and traffic surveillance, habitat monitoring, smart homes, and many more scenarios. One of the major challenges wireless sensor networks face today is security. In this paper, we propose a trust- and energy-aware, location-based routing protocol called Trust and Energy aware Routing (TER). TER uses trust values, energy levels, and location information in order to determine the best paths towards a destination. The protocol balances traffic load and energy, and generates trustworthy paths when all proposed metrics are taken into consideration. Other metrics can be easily integrated into the protocol.
Full Paper

IJCST/43/4/C-1721
    130 Layered Conditional Random Fields for Security Management System in Networks
M.Rani, Syed.Shanvaz

Abstract

Intrusion Detection (ID) is a type of security management system for computers and networks. Intrusion detection faces a number of challenges: a system must reliably detect malicious activities in a network and must perform efficiently to cope with a large amount of network traffic. Here, we address the two issues of accuracy and efficiency using Conditional Random Fields and a Layered Approach. We demonstrate that high attack detection accuracy can be achieved by using Conditional Random Fields and high efficiency by implementing the Layered Approach. Our proposed system based on Layered Conditional Random Fields outperforms other well-known methods such as decision trees and naive Bayes. The improvement in attack detection accuracy is very high for the U2R and R2L attacks. Statistical tests also demonstrate higher confidence in detection accuracy for our method. Finally, we show that our system is robust and able to handle noisy data without compromising performance.
Full Paper

IJCST/43/4/C-1722
    131 Optimizing Strategies for Multichannel MAC Protocol
Vaka Padmavathi, Chepuru SubbaRao

Abstract

Distributed Information Sharing (DISH) is a new collaborative approach to designing multichannel MAC protocols. It aids nodes in their decision-making processes by compensating for their missing information via information sharing through neighboring nodes. This approach was recently shown to significantly boost the throughput of multichannel MAC protocols. However, a critical issue for ad hoc communication devices, viz. energy efficiency, has yet to be addressed. In this paper, we address this issue by developing simple solutions that reduce energy consumption without compromising throughput performance while maximizing cost efficiency. We propose two optimization strategies: in-situ energy-conscious DISH, which uses existing nodes only, and altruistic DISH, which requires additional nodes called altruists. On the other hand, our study also shows that in-situ energy-conscious DISH is suitable only in certain limited scenarios.
Full Paper

IJCST/43/4/C-1723
    132 Prediction of Sensor Nodes Energy Level in Unequal Clustering for WSN
G. Vennira Selvi, R. Manoharan

Abstract

Sensors in wireless sensor networks are prone to failure due to the deployment of WSNs in dynamic, unpredictable, and harsh environments. To prevent nodes from failing and to extend the network lifetime, we propose a probabilistic energy prediction unequal clustering algorithm for wireless sensor networks. In this algorithm, the network is partitioned into unequal clusters and the energy level of each sensor node is predicted using a probabilistic model to prevent failure. The simulation results show that the proposed algorithm achieves this goal efficiently.
Full Paper

IJCST/43/4/C-1724
    133 Research Issues Related to Cache Mechanism in Mobile Devices
Josyula Siva PhaniRam, Dr. G. PardhaSaradhi Varma, Dr. Y.K. Sundara Krishna

Abstract

In mobile computing environments, caching plays a vital role because of its ability to improve performance and availability. The limitations of weakly connected and disconnected operation are often addressed by researchers with various cache mechanisms. It is a known fact that mobile constraints are the major hurdles for adopting various cache replacement strategies, but there is still a dire need to address these strategies to increase the transparency and performance of mobile clients. In this paper, an emphasis is placed on an overview of cache mechanisms along with cache replacement strategies based on Location Dependent Information Services (LDIS).
Full Paper

IJCST/43/4/C-1725
    134 Managing Selfishness in Duplication Provision Over a MANET
S S V Rama Krishna Kumar, D. Srinivas

Abstract

In a mobile ad hoc network, the mobility and resource limitations of mobile nodes may lead to network partitioning or performance degradation. Numerous data duplication techniques have been proposed to reduce this performance degradation. Most of them assume that all mobile nodes cooperate fully in terms of sharing their memory space. In reality, though, some nodes may selfishly decide to cooperate only partially, or not at all, with other nodes. These selfish nodes can then reduce overall data accessibility in the network. In this paper, we examine the impact of selfish nodes in a mobile ad hoc network from the viewpoint of duplication provision; we term this selfish duplication provision. In particular, we develop a selfish node detection algorithm that considers partial selfishness, and new duplication provision techniques to properly cope with selfish duplication provision. Simulations demonstrate that the proposed approach outperforms conventional cooperative duplication provision techniques in terms of data accessibility, communication cost, and average query delay.
Full Paper

IJCST/43/4/C-1726
    135 Analysis of Thyroid Disease Using Backpropagation Algorithm
Gurpreet Kaur

Abstract

Artificial neural networks provide a powerful tool to help doctors analyze, model, and make sense of complex clinical data across a broad range of medical applications. One of the major problems in medical practice is making a diagnosis. Many applications have tried to help human experts by offering a solution, and this paper describes how neural networks can improve this domain. Thyroid problems are among the most prevalent problems nowadays. In this paper, an artificial neural network is developed using the backpropagation algorithm in order to diagnose thyroid problems. It takes a number of factors as input and produces an output which indicates whether a person has the problem or is healthy. The backpropagation algorithm is found to have high sensitivity and specificity.
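The training loop can be sketched with a tiny 2-2-1 sigmoid network trained by backpropagation (the two-feature toy data, where a positive diagnosis requires both risk factors, stands in for real clinical inputs; layer sizes and learning rate are arbitrary demo choices):

```python
import math, random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# 2 inputs -> 2 hidden units -> 1 output; each hidden row ends with a bias.
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

# Toy data: output 1 ("has the problem") only when both factors are present.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    return h, sig(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])

lr = 0.5
for _ in range(10000):
    for x, t in data:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)                                  # output delta
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]   # hidden deltas
        for j in range(2):                                           # gradient steps
            w_o[j] -= lr * d_o * h[j]
            w_h[j][0] -= lr * d_h[j] * x[0]
            w_h[j][1] -= lr * d_h[j] * x[1]
            w_h[j][2] -= lr * d_h[j]
        w_o[2] -= lr * d_o

preds = [round(forward(x)[1]) for x, _ in data]
```

The deltas propagate the output error backwards through the weights, which is the essence of the algorithm; a real thyroid diagnoser would use many more input features and hidden units.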
Full Paper

IJCST/43/4/C-1727
    136 A Framework for Focused Image Crawler and Face Harvester
Shatashi Bansal, Abhinav Goel, Manav Bansal

Abstract

The World Wide Web is a global, read-write information space loaded with text documents, images, multimedia, and many other items of information. Search engines are important tools for gathering information from the World Wide Web; when information is in the form of pictures, it plays a major role in enabling prompt action and ease of use, and there is a human tendency to retain images better than text. This paper presents the development of an image crawler that gathers images from the World Wide Web, identifies and extracts human faces, and stores them for social uses such as building face databases and interest-specific human face collections. To achieve image crawling, we designed two modules: a Web Crawler that searches the relevant web URLs, and a Face Detector that identifies faces, crops them, and stores them in a particular directory.
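The crawler half of the pipeline can be sketched as a breadth-first traversal that collects image URLs as it goes (the in-memory "web" and its page names are invented so the sketch runs without network access; a real crawler would fetch pages over HTTP and hand the images to the face detector):

```python
from collections import deque

# Toy in-memory "web": page -> (outgoing links, image URLs on the page)
web = {
    "seed": (["a", "b"], []),
    "a": (["b"], ["a/face1.jpg"]),
    "b": ([], ["b/face2.jpg", "b/logo.png"]),
}

def crawl(start):
    """Breadth-first crawl collecting candidate face images (JPEGs only)."""
    seen, queue, images = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        links, imgs = web[page]
        images += [u for u in imgs if u.endswith(".jpg")]   # crude image filter
        for nxt in links:
            if nxt not in seen:                              # visit each page once
                seen.add(nxt)
                queue.append(nxt)
    return images

harvested = crawl("seed")
```

The `seen` set prevents revisiting pages, and the suffix filter stands in for the Face Detector module, which in the real system decides which downloaded images actually contain faces.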
Full Paper

IJCST/43/4/C-1728
    137 Ensuring Distributed Accountability for Data Sharing in the Cloud
Mannava Sujatha, GVNKV Subba Rao

Abstract

Cloud computing enables highly scalable services to be easily consumed over the Internet on an as-needed basis. A major feature of cloud services is that users' data are usually processed remotely on unknown machines that users do not own or operate. While enjoying the convenience brought by this new emerging technology, users' fears of losing control of their own data (particularly financial and health data) can become a significant barrier to the wide adoption of cloud services. In particular, we propose an object-centered approach that encloses our logging mechanism together with users' data and policies. We leverage JAR programmable capabilities to both create a dynamic and travelling object and to ensure that any access to users' data will trigger authentication and automated logging local to the JARs. To strengthen users' control, we also provide distributed auditing mechanisms. We provide extensive experimental studies that demonstrate the efficiency and effectiveness of the proposed approaches.
Full Paper

IJCST/43/4/C-1729
    138 Slicing: A New Approach to Privacy Preserving Data Publishing
D. Prathibha, GVNKV Subba Rao

Abstract

Several anonymization techniques, such as generalization and bucketization, have been designed for privacy-preserving microdata publishing. Recent work has shown that generalization loses a considerable amount of information, especially for high-dimensional data. Bucketization, on the other hand, does not prevent membership disclosure and does not apply to data that do not have a clear separation between quasi-identifying attributes and sensitive attributes. In this paper, we present a novel technique called slicing, which partitions the data both horizontally and vertically. We show that slicing preserves better data utility than generalization and can be used for membership disclosure protection. Another important advantage of slicing is that it can handle high-dimensional data. We show how slicing can be used for attribute disclosure protection and develop an efficient algorithm for computing sliced data that obey the ℓ-diversity requirement. Our workload experiments confirm that slicing preserves better utility than generalization and is more effective than bucketization in workloads involving the sensitive attribute. Our experiments also demonstrate that slicing can be used to prevent membership disclosure.
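The vertical and horizontal partitioning at the core of slicing can be sketched as follows (the toy microdata table and bucket size are invented for the demo; the paper's algorithm additionally enforces the ℓ-diversity requirement when forming buckets):

```python
import random

random.seed(1)

# Toy microdata rows: (age, zipcode, disease). The first two attributes form
# the quasi-identifier column; the last one is the sensitive column.
rows = [(25, "47906", "flu"), (28, "47905", "cold"),
        (52, "47302", "cancer"), (56, "47304", "flu")]

def slice_table(rows, bucket_size=2):
    """Partition tuples into buckets, then shuffle the sensitive column within
    each bucket: exact QI-to-disease links are broken while per-bucket
    statistics are preserved."""
    sliced = []
    for i in range(0, len(rows), bucket_size):
        bucket = rows[i:i + bucket_size]
        qi = [(age, zipc) for age, zipc, _ in bucket]
        sensitive = [d for _, _, d in bucket]
        random.shuffle(sensitive)
        sliced.append((qi, sensitive))
    return sliced

published = slice_table(rows)
```

Within each published bucket an adversary sees the set of diseases but can no longer tie a specific disease to a specific (age, zipcode) pair, which is the attribute-disclosure protection the technique aims at.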
Full Paper

IJCST/43/4/C-1730
    139 Effectiveness of Various Supervised Classification Algorithms in Land Use Land Cover Classification
Shivakumar G.S, Dr.S.Natarajan, Dr.K SrikantaMurthy

Abstract

Land use/land cover classification of urban and semi-urban areas requires efficient algorithms to classify high-resolution satellite image data. In supervised classification, defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. This paper reviews and examines five supervised classification algorithms, namely: Maximum Likelihood Classifier, Multinomial Logistic Regression, Neural Network, Random Forest, and Support Vector Machines. Finally, an accuracy assessment of the algorithms is made in the form of user and producer accuracies, the overall accuracy, the confusion matrix, and the kappa coefficient.
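The overall accuracy and kappa coefficient used in such an assessment can be computed directly from the confusion matrix; a small sketch with an invented two-class matrix (e.g. built-up vs. vegetation) follows:

```python
def overall_accuracy(m):
    # fraction of correctly classified pixels (diagonal over grand total)
    n = sum(sum(row) for row in m)
    return sum(m[i][i] for i in range(len(m))) / n

def kappa(m):
    """Cohen's kappa: observed agreement corrected for chance agreement.
    m is a square confusion matrix (rows: reference, columns: predicted)."""
    n = sum(sum(row) for row in m)
    po = sum(m[i][i] for i in range(len(m))) / n
    row_tot = [sum(row) for row in m]
    col_tot = [sum(m[i][j] for i in range(len(m))) for j in range(len(m))]
    pe = sum(row_tot[i] * col_tot[i] for i in range(len(m))) / (n * n)
    return (po - pe) / (1 - pe)

# Invented confusion matrix for two land-cover classes.
m = [[45, 5],
     [10, 40]]
```

For this matrix the overall accuracy is 0.85 while kappa is 0.7, illustrating why kappa is reported alongside accuracy: it discounts the agreement expected by chance.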
Full Paper

IJCST/43/4/C-1731