INTERNATIONAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY (IJCST)-VOL IV ISSUE II, VER. 4, APR. TO JUNE, 2013


S.No. Research Topic Paper ID
130 Computer Forensics Considerations and Tool Selection Within an Organization

Felex Madzikanda, Talent Musiiwa, Washington Mtembo

Abstract

Implementing computer forensics and selecting tools in an organization can be a complex process. Workforce computer misuse may prompt an organization to set up computer forensics at the workplace, which in turn leads to a tool selection process. A range of factors needs to be considered during setup and tool selection (e.g., privacy, open source versus commercial tools, the Data Protection Act, and ethical, policy, and legal issues). We identify the factors that need to be considered in setting up computer forensics at the workplace. We used grid analysis as a decision-making tool to aid the tool selection process, and we note how vital it is to apply grid analysis in an unbiased environment, since it can also be manipulated to attain desired results.
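The grid analysis the abstract refers to is a weighted decision matrix. A minimal sketch follows; the criteria, weights, tool names, and scores are purely illustrative assumptions, not values from the paper:

```python
# Hypothetical grid (decision-matrix) analysis for forensic tool selection.
# Criterion weights (1-5, higher = more important) -- illustrative only.
criteria = {"cost": 4, "court_admissibility": 5, "ease_of_use": 2, "open_source": 3}

# Scores 0-5 per criterion for each candidate tool (made-up values).
tools = {
    "Tool A": {"cost": 2, "court_admissibility": 5, "ease_of_use": 4, "open_source": 0},
    "Tool B": {"cost": 5, "court_admissibility": 3, "ease_of_use": 2, "open_source": 5},
}

def grid_score(scores, weights):
    """Weighted sum: multiply each criterion score by its weight and total."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(tools, key=lambda t: grid_score(tools[t], criteria), reverse=True)
for name in ranked:
    print(name, grid_score(tools[name], criteria))
```

Because the weights dominate the outcome, shifting them is exactly how the abstract's "manipulated to attain desired results" risk arises in practice.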
Full Paper

IJCST/42/4/B-1549
131 Evaluating Student’s Performance Using k-Means Clustering

Rakesh Kumar Arora, Dr. Dharmendra Badal

Abstract

Students' performance plays an important role in the success of any institution. With the significant increase in the number of students and institutions, institutions are becoming increasingly performance oriented and are accordingly setting goals and developing strategies to achieve them. This paper describes a system to analyze the performance of students using the k-means clustering algorithm coupled with a deterministic model. The results of the analysis will assist academic planners in evaluating student performance during a specific semester and in identifying steps to improve students' performance from the next batch onwards.
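The k-means step described above can be sketched in miniature on one-dimensional marks; the data and cluster count below are illustrative assumptions, not the paper's dataset:

```python
# A minimal 1-D k-means (Lloyd's algorithm) sketch for grouping students by marks.
def kmeans_1d(values, k, iters=20):
    """Assign each mark to its nearest centroid, then re-average, repeatedly."""
    centroids = sorted(values)[::max(1, len(values) // k)][:k]  # spread-out init
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            i = min(range(len(centroids)), key=lambda j: abs(v - centroids[j]))
            clusters[i].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

marks = [35, 38, 42, 60, 62, 65, 88, 90, 95]   # hypothetical semester marks
centroids, clusters = kmeans_1d(marks, k=3)
print(sorted(round(c) for c in centroids))      # low / medium / high performer groups
```

The centroids give the planner a natural "weak / average / strong" banding to act on.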
Full Paper

IJCST/42/4/B-1550
132 Critical Issues in Software Testing During Agile Development

P. Rajasekhar, Dr. Arabind Yadav

Abstract

Software testing is the most important process for verifying the quality of a product. Software testing in Agile development is a complex and controversial issue in both the literature and industry. Different people hold different views about software testing in Agile development, because most Agile development methods do not focus much on software testing activities. Agile development strongly emphasizes close customer collaboration, short iterations, and frequent deliveries, but software testing remains challenging, as Agile development does not include many of the destructive testing practices normally required for a quality product. This paper covers the software testing process in Agile development. Agile development processes could be refined and made more beneficial by adding testing practices, and to that end this paper identifies the practices of Agile development in industry and the critical issues faced while practicing it. The issues of automated and manual testing, good practices in automation, and how to manage independent testing teams in Agile development are also highlighted. The paper addresses every aspect of the software testing process in Agile development and is based on literature reviews and an industrial survey.
Full Paper

IJCST/42/4/B-1551
133 Performance Evaluation of Routing Algorithms for Multi-Cell Scalable Architecture of Integrated WiMAX-WiFi Heterogeneous Network

S.V.S. Rama Krishnam Raju, Dr. B. C. Jinaga, Dr. Kahkashan Tabassum, Dr. Damodaram A

Abstract

A wireless communication network architecture with integrated WiMAX-WiFi broadband technologies is a potentially fast and economical method of providing wireless access services to a huge number of users. This network has a multi-hop, multi-tier design. To offer uninterrupted and seamless service to all subscribers in a wireless network, we can deploy a low-cost, flexible heterogeneous network that not only admits any kind of network for efficient spectrum utilization but also improves network capacity. This setup provides easily deployed data communication with broad coverage, high throughput, high data rates, low end-to-end delay, and low jitter. We use Wi-Fi (IEEE 802.11n) and WiMAX (IEEE 802.16e) in our simulation since they support mobility. The choice of an appropriate routing protocol for high-speed seamless applications plays a vital role in the design of a scalable, flexible, and efficient integrated network, since the selected routing protocol is responsible for handling mobility and the topology within the network. In this paper we therefore design an integrated Wi-Fi/WiMAX network and analyze its performance under various routing protocols. This analysis allows the user to select the most suitable routing protocol.
Full Paper

IJCST/42/4/B-1552
134 A Comparative Study Between Genetic Algorithm (GA) and Particle Swarm Optimization (PSO)

Prasannajit Dash, Dr.Maya Nayak, Deepak Kumar

Abstract

The genetic algorithm has been the most popular technique in evolutionary computation research. In the Genetic Algorithm (GA), the representation used is typically a fixed-length bit string: each position in the string represents a particular feature of an individual, with the feature's value stored at that position. Particle Swarm Optimization (PSO) is used here to find the optimal fitness value. Simulations are performed over various standard test data, and comparisons are made with the GA. The experimental results show that the proposed PSO-based method performs better than the GA method.
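A minimal PSO loop of the kind compared against the GA might look like this; the sphere benchmark and the parameter values (w, c1, c2, swarm size) are common textbook defaults, not taken from the paper:

```python
import random

random.seed(1)  # reproducible run

def sphere(x):
    """Standard test function f(x) = sum(x_i^2); global minimum 0 at the origin."""
    return sum(v * v for v in x)

dim, n_particles, iters = 2, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive and social coefficients

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]           # each particle's personal best position
gbest = min(pbest, key=sphere)[:]     # swarm-wide best position

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]

print(sphere(gbest))   # best fitness found, close to the optimum 0
```

Unlike a GA, there is no crossover or mutation: particles move continuously, pulled toward their own and the swarm's best positions.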
Full Paper

IJCST/42/4/B-1553
135 An Approach for Fast Execution of Ad-Hoc Queries in Massive Data using Indexing and Off-line Analysis

Vaibhav Jain, Mangesh Wanjari

Abstract

Nowadays there is a growing demand for massive data [1] processing and analysis applications. Accessing this big data may take a long time depending on the query fired. It is also observed that about 90% of queries are fired on about 10% of the data, so there should be a mechanism for handling this 10% of the data and the corresponding queries. Semantic analysis is used to match queries. Processing this big data also consumes time: every time a query is fired, the information is retrieved from the big data, and if the same query is fired again, the same procedure is repeated. One approach is to store these results temporarily in a separate database and, when a similar query is later fired, fetch the stored results just prior to main query execution. We provide an approach for fast execution of such queries to retrieve the useful data using semantic processing and indexing.
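The temporary-results idea can be sketched as a small cache keyed on a normalized query; the normalization below is a toy stand-in (case and whitespace only) for the paper's semantic matching, and all names are illustrative:

```python
# Minimal sketch of reusing stored results for equivalent repeated queries.
cache = {}

def normalize(query):
    """Lower-case and collapse whitespace so trivially equivalent queries share a key."""
    return " ".join(query.lower().split())

def run_query(query, execute):
    key = normalize(query)
    if key in cache:                 # hit: skip the expensive big-data scan
        return cache[key], True
    result = execute(query)          # miss: run against the big data store
    cache[key] = result
    return result, False

calls = []
def fake_execute(q):                 # stand-in for the real query engine
    calls.append(q)
    return ["row1", "row2"]

run_query("SELECT name FROM users", fake_execute)
result, hit = run_query("select   name from USERS", fake_execute)
print(hit, len(calls))   # second, equivalent query never reaches the store
```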
Full Paper

IJCST/42/4/B-1554
136 Comparison of Feature Extraction Technique Used for Isolated Word STTD System

Virender Kadyan, Ashish Chopra

Abstract

In modern speech recognition systems, a set of Feature Extraction Techniques (FETs) such as Mel-frequency cepstral coefficients (MFCC) and perceptual linear prediction (PLP) coefficients is mainly used. Compared to conventional FETs such as LPCC, these approaches provide a better representation of the relevant information in the speech signal uttered by the speaker during training and testing of a Speech To Text Detection System (STTDS) for different Indian languages. In this paper, the parameter values of these FETs (MFCC, PLP) are varied at the front end along with a dynamic HMM topology at the back end, and the speech signals produced by these techniques are analysed using the HTK toolkit. The paper also reviews the current state of the art and recent research in pursuit of these goals; the cornerstone of all current state-of-the-art STTDS is the use of HMM acoustic models. In our work, the effectiveness of the proposed FETs (MFCC, PLP features) is tested, and MFCC and PLP acoustic features are compared for extracting relevant information about what is being spoken from the audio signal, with experimental results computed while varying the HMM topology at the back end.
Full Paper

IJCST/42/4/B-1555
137 Devanagari Script Conjunct Characters Segmentation Based on Character Structural Properties by Horizontal Projection

Ankur Kumar Aggarwal, Aman Kumar Aggarwal

Abstract

Optical character recognition is widely used for generating digital computer data from printed or handwritten text documents. Devanagari script is the basis for many Indian languages, and a lot of work has been completed on optical character recognition of Devanagari script. The script consists of numerous fundamental characters such as partial consonants, full consonants, vowel modifiers, and diacritics. Segmentation of touching or fused characters of Devanagari script is sometimes difficult due to interline spacing, overlapping, and noise. The algorithm used here segments fused Devanagari characters into their constituent partial or full consonants. The structural properties of Devanagari script are used together with the heights and widths of consonant or conjunct characters to find the best point of segmentation of fused characters.
Full Paper

IJCST/42/4/B-1556
138 Component Retrieval Using Genetic Algorithm Based Optimization Technique

Kamna Mahajan, Mandeep Kaur

Abstract

Reusable components stored in a repository are useful in developing systems early and with better quality. One of the most fundamental problems is retrieving software components from a large repository: to reuse a software component, you first have to find it. Retrieval of components should be efficient and less time consuming. Here, genetic algorithms and an optimization technique are used for finding the best component: genetic algorithms first shortlist the components, and these components are then refined using ant colony optimization. Problems such as selecting and retrieving the best component from a repository can thus be solved by the ant colony.
Full Paper

IJCST/42/4/B-1557
139 Power Gating Methods: Comparative Study

B.Prasanna Jyothi, D.Sunil Suresh

Abstract

In this paper we present a comparative study of the power gating techniques used in integrated circuits to reduce power consumption.
Full Paper

IJCST/42/4/B-1558
140 Fabrics Fault Processing Using Image Processing Technique in MATLAB

Jagruti Mahure, Y.C.Kulkarni

Abstract

The main objective of this paper is the processing of defective fabric parts. In the textile industry, automatic fabric inspection is important for maintaining fabric quality. This paper proposes an approach to recognizing fabric defects in the textile industry in order to minimize production cost and time, since the work of inspectors is tedious, time consuming, and costly. Wastage reduction through accurate and early-stage detection of defects in fabrics is also an important aspect of quality improvement. The recognizer acquires digital fabric images with an image acquisition device and converts each image into a binary image using restoration and thresholding techniques.
Full Paper

IJCST/42/4/B-1559
141 Performance Evaluation of a New AES Based Image Encryption Technique

Chandra Prakash Dewangan, Shashikant Agrawal

Abstract

In the era of computer and internet technology, multimedia protection is becoming increasingly jeopardized, so numerous ways of protecting information are being utilized by individuals, businesses, and governments. This paper focuses on a new AES-based image encryption technique using binary code, which is expected to provide safety for images travelling over the internet. To evaluate its performance, the proposed image encryption technique was measured through a series of tests: histogram analysis, information entropy, correlation analysis, and differential analysis. Experimental results showed that the proposed encryption technique has satisfactory security and is more efficient than using the AES algorithm alone, which makes it a good technique for the encryption of multimedia data. The results showed that the histogram of an encrypted image follows a uniform distribution, very different from the histogram of the plain image; that the correlation among image pixels was significantly decreased by the proposed technique; and that higher entropy was achieved.
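The information-entropy test mentioned above can be sketched as follows. The "images" here are synthetic byte strings, not the paper's data; the point is that an ideal 8-bit cipher image approaches 8 bits of entropy per pixel, while a plain image is far lower:

```python
import math
from collections import Counter

def entropy(data):
    """Shannon entropy in bits per byte of a byte sequence."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plain = bytes([10] * 900 + [200] * 100)   # low-variety stand-in "plain image"
cipher = bytes(range(256)) * 4            # uniform stand-in "encrypted image"
print(round(entropy(plain), 3), round(entropy(cipher), 3))
```

The same byte-histogram (`counts`) also drives the histogram-uniformity check the abstract describes.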
Full Paper

IJCST/42/4/B-1560
142 Survey on Improved DBSCAN Algorithm

Chetan Dharni, Meenakshi Bansal

Abstract

Clustering is one of the most important techniques in data mining; it extracts knowledge from large databases. The basic aim of clustering is to organize similar objects into the same cluster and dissimilar objects into different clusters. When clusters have widely different shapes, densities, and sizes, finding clusters in data becomes a challenging task. Over the last two decades, different types of clustering algorithms have been proposed to resolve this problem, and among them density-based clustering methods are considered the most effective. The DBSCAN algorithm is an important density-based algorithm that detects clusters of arbitrary shapes and also eliminates the noise and outliers present in the data. Our aim is to improve the existing DBSCAN algorithm by directly selecting the input parameters and finding density-varied clusters. DBSCAN requires only two input parameters and is very effective for analyzing large and complex databases. In this paper, we briefly describe the density-based methods and compare them from a theoretical viewpoint. Finally, we give some suggestions for improving the algorithm and for future work.
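For reference, the classic DBSCAN that such surveys build on can be sketched compactly; the 2-D points and the two input parameters (eps, min_pts) below are illustrative:

```python
# Compact DBSCAN sketch: labels each point with a cluster id, or -1 for noise.
def dbscan(points, eps, min_pts):
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if (points[i][0] - q[0]) ** 2 + (points[i][1] - q[1]) ** 2 <= eps * eps]
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                  # noise (may later become a border point)
            continue
        labels[i] = cid
        seeds = list(nbrs)
        while seeds:                        # grow the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid             # border point: relabel, do not expand
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbors(j)
            if len(jn) >= min_pts:          # j is a core point: keep expanding
                seeds.extend(jn)
        cid += 1
    return labels

pts = [(0, 0), (0.5, 0), (0, 0.5), (5, 5), (5.4, 5), (5, 5.4), (20, 20)]
print(dbscan(pts, eps=1.0, min_pts=3))
```

The two dense groups form clusters 0 and 1, and the isolated point is marked -1, illustrating the arbitrary-shape and noise-elimination behaviour the abstract mentions.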
Full Paper

IJCST/42/4/B-1561
143 A Review on Evaluation of Binarization Techniques on Camera-Based Images

Ruchika Sharma, Balwinder Singh

Abstract

This paper presents a review of binarization techniques for camera images. Several algorithms have previously been proposed for improving the thresholding of degraded document images. No algorithm can solve all types of problems, but some algorithms are better than others in specific situations. This article reviews local and global binarization algorithms for improving the binarization of camera images, evaluating the different algorithms and identifying which gives the better result; this serves as the major contribution of the paper.
Full Paper

IJCST/42/4/B-1562
144 Employing HASBE Scheme for Assisting Access Controls in Out-Sourced Data Clouds

Darwin.V.Tomy, Dhanalakshmi.S, Karthik.S

Abstract

Cloud computing has emerged as one of the leading paradigms in IT engineering over the past decade. Since this innovative computing technology requires users to deliver their valuable data to cloud service providers, there have been growing security and privacy concerns about data held by outside suppliers. Several schemes employing attribute-based encryption (ABE) have been suggested for access control of outsourced data in cloud computing; however, most of them suffer from rigidity in applying complex access control strategies. As the cloud uses virtualization in the back end, all the methods implemented on real platforms can also be implemented in the cloud. In many distributed systems, a user should only be able to access data if the user holds a certain set of credentials or attributes. Until recently, the only method for enforcing such policies was to employ a trusted server to store the data and mediate access control; however, if the safety of the server storing the data is compromised, then the secrecy of the data is exposed.
Full Paper

IJCST/42/4/B-1563
145 A Lightweight Multi-Tier Authentication Method in Cloud Computing

Mayank Gaurav, Jasvinder Pal Singh, Gaurav Shrivastav

Abstract

Cloud computing has great potential for providing robust computational power to society at reduced cost. It enables customers with limited computational resources to outsource their large computation workloads to the cloud and economically enjoy massive computational power, bandwidth, storage, and even appropriate software, shared in a pay-per-use manner. Cloud storage moves the user's data to large, remotely located data centers over which the user has no control. This unique feature of the cloud poses many new security challenges that need to be clearly understood and resolved. A key area in security research is authentication, which means checking the identity of the user: whether the person is who he claims to be. This paper presents an overview of different authentication methods and then proposes a scheme in which the authentication process is carried out in two levels, or tiers. The advantage of this work is that it does not require any additional hardware or software, so it can be used and accessed from anywhere across the globe.
Full Paper

IJCST/42/4/B-1564
146 Computer Trend with Security by RSA, DES and BLOWFISH Algorithm

Vaibhav Shrivastava, Gurpal Singh

Abstract

Communication applications such as internet and network applications are evolving very fast, and security is among their most challenging aspects. Implementations of three encryption algorithms are shown in this paper: the DES, BLOWFISH, and RSA algorithms. The algorithms are implemented in Java using the Java Cryptography Architecture (JCA).
Full Paper

IJCST/42/4/B-1565
147 A Survey On Intrusion Detection System with Similar Alarm Accumulation and Notification

Anoop Shankar, S S Jadhav

Abstract

A single intrusive attack instance may spread over many network connections or log file entries and create thousands of alarms for the same attack instance. At present, most Intrusion Detection Systems are quite reliable in detecting suspicious actions by evaluating TCP/IP connections or log files. Once an Intrusion Detection System finds a suspicious event, it immediately generates an alarm containing information about the source, the target, and the type of the attack (e.g., SQL injection or buffer overflow). IDSs usually focus on detecting attack types, not on differentiating attack instances, and even low rates of false alerts can easily result in a high total number of false alerts. Here, alert aggregation is the main subtask of the intrusion detection system, and the main goal is to identify and cluster different alerts originating from low-level IDSs, firewalls (FWs), etc. Alarms that belong to one attack instance must be clustered together, and meta-alerts must be generated for these clusters. We suggest a novel technique for alert accumulation and aggregation on the network based on a dynamic, probabilistic model of the current attack situation.
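A deliberately simplified (non-probabilistic) sketch of the clustering-into-meta-alerts idea: alerts sharing source, target, and attack type within a time window collapse into one meta-alert. The field names and the window length are illustrative assumptions, not the paper's model:

```python
from collections import defaultdict

def aggregate(alerts, window=60):
    """Group raw alarms into meta-alerts by (src, dst, type) within a time window."""
    groups = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a["time"]):
        key = (a["src"], a["dst"], a["type"])
        bucket = groups[key]
        if bucket and a["time"] - bucket[-1]["alerts"][-1]["time"] <= window:
            bucket[-1]["alerts"].append(a)        # same attack instance
        else:
            bucket.append({"alerts": [a]})        # start a new meta-alert
    return [{"src": k[0], "dst": k[1], "type": k[2], "count": len(m["alerts"])}
            for k, v in groups.items() for m in v]

alerts = [
    {"src": "10.0.0.5", "dst": "srv1", "type": "sql_injection", "time": t}
    for t in (0, 10, 20)
] + [{"src": "10.0.0.9", "dst": "srv1", "type": "buffer_overflow", "time": 5}]
metas = aggregate(alerts)
print(len(metas))   # 4 raw alarms reduce to 2 meta-alerts
```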
Full Paper

IJCST/42/4/B-1566
148 A Review on Email Spam in Data Mining

Amandeep Singh, Meenakshi Bansal

Abstract

In recent years, spam has become a big problem for the Internet and electronic communication. Spam is the use of electronic messaging systems (including most broadcast media and digital delivery systems) to send unsolicited bulk messages indiscriminately. Taxonomists, social scientists, psychologists, biologists, statisticians, mathematicians, engineers, computer scientists, medical researchers, and others who collect and process real data have all contributed to clustering methodology. With the increasing popularity of e-mail, the e-mail spam problem grows proportionally. Traditional anti-spamming methods filter spam e-mails and prevent them from entering the inbox but take no measures to trace spammers.
Full Paper

IJCST/42/4/B-1567
149 A Survey on Unauthorized AP Detection in WLAN by Measuring DNS RTT

M.K.Nivangune, Sandeep Vanjale, Mousami Vanjale

Abstract

The most serious network security problem for network administrators is the presence of unauthorized access points, in other words rogue access points; if one remains undetected, it is the easiest way for an attacker to break into the system through the wireless Local Area Network (WLAN). Many intruders have taken advantage of illegal access points in enterprises not only to get free Internet access but also to view confidential information. To date, many solutions exist to detect unauthorized access points, but these approaches are rudimentary and are easily evaded by intruders. In this paper we propose a solution to detect unauthorized access points by measuring DNS round-trip time (RTT).
Full Paper

IJCST/42/4/B-1568
150 Efficiently Providing Data Security and Linear Programming in Cloud Computing

S.Kranthi Kumar, S.Venkateswarlu

Abstract

Data protection in a cloud platform architecture dramatically reduces the per-application development effort required to offer data protection while still allowing rapid development and maintenance. Although cloud computing promises lower costs, rapid scaling, easier maintenance, and service availability anywhere, anytime, a key challenge is how to ensure and build confidence that the cloud can handle user data securely. A recent Microsoft survey found that "58 percent of the public and 86 percent of business leaders are excited about the possibilities of cloud computing," but more than 90 percent of them are worried about the security, availability, and privacy of their data as it rests in the cloud. Consequently, many users do not trust cloud providers to store their information, since most security techniques, such as third-party auditing, can be compromised. We propose a new technique based on the fully homomorphic encryption (FHE) scheme, which has numerous applications, especially in cloud computation. Recently, FHE has been the focus of extensive study, but mostly on the construction and efficiency of the scheme, with little focus on its application. In this paper, we present a new system for searching on encrypted data that combines ABE (Attribute-Based Encryption) and FHE. Our system enables anyone, even without the private key of the encrypted data, to search the data. Finally, we discuss the application of FHE to the outsourcing of computation and present two different systems matching different requirements.
Full Paper

IJCST/42/4/B-1569
151 Use of Data Mining in Enhancing IDS Based Security

Savyasachi, Rajneesh Agrawal, Sandeep Sahu

Abstract

An important problem in intrusion detection is how to effectively separate attack patterns from normal data patterns in a large volume of network data, and how to effectively generate automatic intrusion rules after collecting raw network data. To accomplish this, various data mining techniques are used, such as classification, clustering, and association rule mining. Examples of data-mining-based misuse detection models for IDSs are JAM (Java Agents for Meta-learning), MADAM ID (Mining Audit Data for Automated Models for Intrusion Detection), and Automated Discovery of Concise Predictive Rules for Intrusion Detection. Ant clustering is a novel data mining approach that uses an ants technique to find relevant information and place it in various clusters. Since several ants work in parallel, the processing speed of the system is high, and for large data sets it is worth applying ant clustering. This paper proposes to perform mining on the data collected from the IDS to enhance the speed of intrusion detection, with automatic detection using specific attributes of the intrusions. The various phases of the proposed work perform data collection, cleaning, clustering, detection, and alarming.
Full Paper

IJCST/42/4/B-1570
152 A Fuzzy Forensic Analysis System for DDoS Attack in MANET

Sarah Ahmed, S.M. Nirkhi

Abstract

A Mobile Ad-Hoc Network (MANET) is a distributed wireless communication network comprising wireless mobile nodes that dynamically self-organize into ad hoc topologies. In a MANET, the nodes can seamlessly interconnect with each other without pre-existing infrastructure. This feature makes a MANET scalable, but it also increases the chances of security threats: since nodes can dynamically connect, the scope for a malicious node to enter the normally working network is increased. An easy-to-launch attack is denial of service (DoS), in which an attacker paralyses the target network; when coordinated by a group of attackers, it is considered distributed denial of service (DDoS). A DoS attack, which floods an excessive volume of traffic to deplete key resources of the target network, does not require special capabilities. The dynamic nature of MANETs calls for a self-route-management routing protocol such as DSR. DoS/DDoS attacks at the route discovery phase of DSR can be launched by attackers or malicious nodes by flooding route request messages (RREQ), damaging the normal network for some duration of time. When an attack on the target system is successful enough to crash or disrupt it, this event, as the breach, triggers an investigation. Forensic investigation and analysis provide a source of digital evidence. There is a quest to answer questions related to the security breach and a requirement to provide proof against the malicious activity; for this, network forensics is performed and a forensic analysis tool is required. Flooding RREQ in violation of broadcasting rules can be recognized, but if done intelligently it is difficult to recognize, so forensic analysis needs an intelligent tool. In this paper, we elaborate a fuzzy forensic analysis system.
Full Paper

IJCST/42/4/B-1571
153 Effect on Throughput-Delay With Various Routing Protocols in IEEE 802.11 Wireless Ad-Hoc Networks

Mahendra Kumar, A. K. Jain

Abstract

Wireless ad-hoc networks have a dynamic nature that leads to constant changes in their topology. Therefore, there is a need for a robust dynamic routing protocol that can face the challenges of a frequently changing topology. This article presents the effect of different routing protocols on the throughput and delay of wireless ad-hoc networks. The network simulator QualNet 5.0.2 has been used to evaluate the performance of wireless networks with various routing protocols.
Full Paper

IJCST/42/4/B-1572
154 Review on Software Complexity Measurement and Recitation

Sudhir Kumar Singh

Abstract

Software complexity is widely regarded as an important determinant of software maintenance costs: increased software complexity means that maintenance and enhancement projects will take longer, cost more, and result in more errors. In this paper we review software complexity measurement and recitation.
Full Paper

IJCST/42/4/B-1573
155 Database Primitives, Algorithms and Implementation Procedure: A Study on Spatial Data Mining

J Rajanikanth, Dr. T.V. Rajinikanth

Abstract

Spatial data mining means the extraction of useful knowledge from large amounts of spatial data. It is a highly demanding field because huge amounts of spatial data have been collected in various applications, ranging from remote sensing to geographic information systems, computer cartography, and environmental assessment and planning. The explosive growth of collected data poses challenges to the research community in terms of the ability to analyze it. Recent studies have extended the scope of data mining from traditional relational and transactional databases to spatial databases. This study provides data mining primitives, typical algorithms, profile procedures, and their performance under various situations.
Full Paper

IJCST/42/4/B-1574
156 Mutation Operators corresponding Conditions Contributing in Deporting them Equivalently

Tannu Singla, Ajay Kumar

Abstract

Mutation testing is one of the most dexterous testing techniques for retracing faults. In order to evaluate the exact mutation score in mutation testing, the vital question is whether a mutant is equivalent to its program; unfortunately, answering this question is not always possible. In this paper, we introduce mutation operators and the conditions that cause a mutant to behave equivalently. Based on specific criteria, detection of a program's equivalent mutants becomes ingenious, which is useful in calculating the mutation score of the program accurately.
Full Paper

IJCST/42/4/B-1575
157 Performance and Security Analysis of Improved Hsu Et. Al’s Multi Proxy Multi Signature Scheme With Shared Verification

Ashwani Malhotra, Raman Kumar

Abstract

A proxy is basically another computer that serves as a hub through which internet requests are processed: by connecting through one of these servers, a computer sends its requests to the proxy server, which processes the request and returns the result. A proxy signature allows a delegator to give partial signing rights to other parties, called proxy signers, on its behalf, for example in the case of temporary absence or a lack of time or computational power. A multi-proxy multi-signature represents a certain number of proxy signers signing a given message; the number of signers is not fixed, and the signers' identities are evident from a given multi-signature. The delegated proxy signer can compute a proxy signature that can be verified by anyone with access to the original signer's certified public key. This work introduces a scheme that violates the claimed security requirements of proxy protection and unforgeability. Further, we propose a new, efficient, and secure non-repudiable multi-proxy multi-signature scheme with shared verification, with analysis of various attacks: forgery, black hole, wormhole, denial of service, and jamming.
Full Paper

IJCST/42/4/B-1576
158 Preventing Jamming Attacks in Wireless Sensor Networks Through Cryptography

Nimmagadda.Hemanth Kumar, Dr. P. Bala Krishna Prasad, G. Guru Kesava Dasu

Abstract

Understanding and defending against jamming attacks has long been a problem of interest in wireless communication and radar systems. In wireless ad hoc and sensor networks using multihop communication, the effects of jamming at the physical layer resonate into the higher layer protocols, for example by increasing collisions and contention at the MAC layer, interfering with route discovery at the network layer, increasing latency and impacting rate control at the transport layer, and halting or freezing the application layer. Adversaries that are aware of higher-layer functionality can leverage any available information to improve the impact or reduce the resource requirement for attack success. For example, jammers can synchronize their attacks with MAC protocol steps, focus attacks on specific geographic locations, or target packets from specific applications. In this work, we address the problem of selective jamming attacks in wireless networks. In these attacks, the adversary is active only for a short period of time, selectively targeting messages of high importance. We illustrate the advantages of selective jamming, in terms of network performance degradation and adversary effort, by presenting two case studies: a selective attack on TCP and one on routing. We show that selective jamming attacks can be launched by performing real-time packet classification at the physical layer. To mitigate these attacks, we develop three schemes that prevent real-time packet classification by combining cryptographic primitives with physical-layer attributes. We analyse the security of our methods by adding public key encryption algorithms (e.g., RSA) and evaluate their computational and communication overhead.
Full Paper

IJCST/42/4/B-1577
159 Analysis and Design of an Algorithm Using Data Mining Techniques for Matching and Predicting Crime

Anshu Sharma, Raman Kumar

Abstract

Crime analysis uses past crime data to predict future crime locations and times. Criminology is the scientific study of crime and criminal behaviour, a process that aims to identify crime characteristics and patterns. The high volume of crime datasets, and the complexity of the relationships within them, make criminology one of the most important fields for applying data mining techniques. The exponentially increasing amounts of data generated each year make extracting useful information from that data ever more critical. Analysis of the data includes simple query and reporting, statistical analysis, more complex multidimensional analysis, and data mining. This paper discusses an approach based on data mining techniques to extract important patterns from reports gathered from the city police department. The reports are written in plain text and are first converted into a format understandable by the tool. Existing data mining techniques, such as clustering and classification, are then applied to obtain patterns in the crime data, and a new algorithm is proposed to improve the accuracy of the crime-pattern-detection system. This paper presents a new K-means algorithm using a weighted approach, and its results are compared with the existing K-means clustering algorithm; the weighted approach proves to be the better of the two.
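A minimal sketch of a weighted K-means variant: the only change from standard K-means is that each centroid update is a weight-scaled mean, so heavily weighted records (e.g., more severe crimes) pull centroids more strongly. The function names, weighting rule and toy data are assumptions for illustration, not the paper's algorithm:

```python
import random

def weighted_kmeans(points, weights, k, iters=20, seed=0):
    """points: list of equal-length tuples; weights: one weight per point."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(points):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centroids[c])))
        # Update step: centroids become *weighted* means of their members.
        for c in range(k):
            idx = [i for i in range(len(points)) if assign[i] == c]
            if not idx:
                continue
            w = sum(weights[i] for i in idx)
            centroids[c] = tuple(sum(weights[i] * points[i][d] for i in idx) / w
                                 for d in range(len(points[0])))
    return centroids, assign
```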
Full Paper

IJCST/42/4/B-1578
160 Modified Icon Search from the Picture Allocation Websites

K. Rama Devi, B.Prashant

Abstract

Web search engines help users find useful information on the World Wide Web (WWW). However, when the same query is submitted by different users, typical search engines return the same result regardless of who submitted the query. Users are increasingly pursuing complex task-oriented goals on the Web, such as making travel arrangements, managing finances or planning purchases. Searchers create and use external records of their actions and the corresponding results by writing or typing notes, using copy-and-paste functions, and making printouts. Social media sites such as Flickr and del.icio.us allow users to upload content, annotate it with descriptive labels known as tags, join special-interest groups, and so on. We believe that user-generated metadata expresses a user's tastes and interests and can be used to tailor information to an individual user. Specifically, we describe a machine-learning method that analyzes a corpus of tagged content to find hidden topics. We then use these learned topics to select content that matches the user's interests. We empirically validated this approach on the social picture-allocation site Flickr, which allows users to annotate icons with freely chosen tags and to search for icons labeled with a certain tag. We use metadata associated with icons tagged with an ambiguous query term to identify topics corresponding to different senses of the term, and then modify the results of icon search by displaying to the user only those icons that are of interest to her.
Full Paper

IJCST/42/4/B-1579
161 Scalable and Secure Third Party Auditing in Cloud Computing

N.Samatha, Dr. P. Bala Krishna Prasad, G. Guru Kesava Das

Abstract

Cloud computing is the newest term for the long-dreamed vision of computing as a utility. The cloud provides convenient, on-demand network access to a centralized pool of configurable computing resources that can be rapidly deployed with great efficiency and minimal management overhead. Industry leaders and customers have wide-ranging expectations for cloud computing, in which security concerns remain a major aspect. The cloud moves application software and databases to centralized large data centers, where the management of the data and services may not be fully trustworthy. This unique paradigm brings about many new security challenges, which have not been well understood. This work studies the problem of ensuring the integrity of data storage in Cloud Computing. In particular, we consider the task of allowing a third-party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of a TPA relieves the client of the burden of auditing whether the data stored in the cloud is indeed intact, which can be important in achieving economies of scale for Cloud Computing. Support for data dynamics via the most general forms of data operation, such as block modification, insertion and deletion, is also a significant step toward practicality, since services in Cloud Computing are not limited to archive or backup data only. While prior works on ensuring remote data integrity often lack support for either public auditability or dynamic data operations, this paper achieves both. We first identify the difficulties and potential security problems of direct extensions with fully dynamic data updates from prior works, and then show how to construct an elegant verification scheme for the seamless integration of these two salient features in our protocol design.
In particular, to achieve efficient data dynamics, we improve the existing proof of storage models by manipulating the classic Merkle Hash Tree construction for block tag authentication. To support efficient handling of multiple auditing tasks, we further explore the technique of bilinear aggregate signature to extend our main result into a multi-user setting, where TPA can perform multiple auditing tasks simultaneously.
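The Merkle Hash Tree underlying the block-tag authentication can be sketched as follows: the auditor keeps only the root, and the server proves any block's integrity with a logarithmic-size sibling path. This is a generic MHT (build, prove, verify), not the paper's full protocol, and the helper names are assumptions:

```python
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def _next_level(level):
    if len(level) % 2:            # duplicate last node on odd levels
        level = level + [level[-1]]
    return [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]

def merkle_root(blocks):
    level = [h(b) for b in blocks]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def merkle_proof(blocks, index):
    """Sibling path for blocks[index]: list of (sibling_hash, sibling_is_left)."""
    level, proof = [h(b) for b in blocks], []
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]
        sib = index ^ 1
        proof.append((level[sib], sib < index))
        level = _next_level(level)
        index //= 2
    return proof

def verify(block, proof, root):
    node = h(block)
    for sib, sib_is_left in proof:
        node = h(sib + node) if sib_is_left else h(node + sib)
    return node == root
```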
Full Paper

IJCST/42/4/B-1580
162 Protection Against Significant Online Attacks on Login Password

Srinivasa Rao.Morla, Padavala Sai Prasad

Abstract

Users are typically authenticated by their passwords. Because people are known to choose convenient passwords, which tend to be easy to guess, authentication protocols have been developed that protect user passwords from guessing attacks. Brute-force and dictionary attacks on password-only remote login services are now widespread and ever increasing. Enabling convenient login for legitimate users while preventing such attacks is a difficult problem. Automated Turing Tests (ATTs) continue to be an effective, easy-to-deploy approach to identifying automated malicious login attempts at a reasonable cost of inconvenience to users. The protocols proposed to date, however, use more messages and rounds than protocols that are not resistant to guessing attacks. In this paper, we propose a new Password Guessing Resistant Protocol (PGRP), derived by revisiting prior proposals designed to restrict such attacks. While PGRP limits the total number of login attempts from unknown remote hosts to as low as a single attempt per username, legitimate users in most cases (e.g., when attempts are made from known, frequently used machines) can make several failed login attempts before being challenged with an ATT. By adding public-key encryption algorithms (e.g., RSA), good performance can be achieved. We analyze the performance of PGRP with two real-world data sets and find it more promising than existing proposals.
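The core decision such a protocol makes, whether a login attempt must first solve an ATT, can be sketched as a small state machine: known machines get a grace of several failed attempts, unknown ones as few as one. The class name and the thresholds `k1`/`k2` below are illustrative defaults, not PGRP's exact parameters:

```python
class PGRPSketch:
    """Decide whether a login attempt must be challenged with an ATT
    (e.g., a CAPTCHA) before the password is even checked."""

    def __init__(self, k1=3, k2=1):
        self.k1, self.k2 = k1, k2   # grace for known / unknown hosts
        self.known = set()          # (host, user) pairs with a past success
        self.failures = {}          # (host, user) -> failed-attempt count

    def requires_att(self, host, user):
        limit = self.k1 if (host, user) in self.known else self.k2
        return self.failures.get((host, user), 0) >= limit

    def record(self, host, user, success):
        if success:
            self.known.add((host, user))
            self.failures.pop((host, user), None)
        else:
            self.failures[(host, user)] = self.failures.get((host, user), 0) + 1
```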
Full Paper

IJCST/42/4/B-1581
163 Optimizing The Network Failures by Self-Determining Acyclic Digraph

K.V.Uma Karuna Devi, T.Padmaja

Abstract

As the Internet takes an increasingly central role in our communications infrastructure, the slow convergence of routing protocols after a network failure becomes a growing problem. To assure fast recovery from link and node failures in IP networks, we present an approach based on Directed Acyclic Graphs (DAGs). We develop algorithms to compute link-independent and node-independent DAGs. The algorithm guarantees that, in a two-vertex-connected network, every edge other than the ones originating from the root may be used in either of the two node-disjoint DAGs. The algorithms achieve multipath routing, utilize all possible edges, guarantee recovery from single-node failures, and reduce the number of overhead bits required in the packet header. Moreover, the use of multiple non-disjoint paths is advantageous for load balancing and for preventing snooping on data, in addition to improving resiliency to multiple link failures.
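The orientation idea can be illustrated by computing hop counts from the root (the destination) with BFS and directing every edge from the farther endpoint to the nearer one, which yields a DAG in which every path leads to the root and many nodes gain multiple next hops. This is a generic single-DAG illustration under that assumption, not the paper's link-/node-independent construction:

```python
from collections import deque

def build_dag(adj, root):
    """adj: node -> list of neighbours (undirected graph).
    Returns node -> list of next hops toward the root."""
    # BFS hop counts from the root.
    dist = {root: 0}
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    # Orient each edge from larger hop count to smaller: acyclic by construction.
    dag = {u: [] for u in adj}
    for u in adj:
        for v in adj[u]:
            if dist[v] < dist[u]:
                dag[u].append(v)
    return dag
```

On a 4-node ring with root 0, node 2 ends up with two next hops (via 1 and via 3), so a single node failure on one path leaves the other usable.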
Full Paper

IJCST/42/4/B-1582
164 Efficiently Providing Cloud Computing Security

G.Ujwala Naidu, Y.Sankara Rao

Abstract

Cloud storage is a model of networked online storage where data is stored in virtualized pools of storage, generally hosted by third parties. Organizations cite data confidentiality as their most serious concern about cloud computing; with encrypted data stored on a third party's cloud system, the functionality of the storage system is limited when general encryption schemes are used for data confidentiality. With this in mind, we propose a system constructed from key servers and storage servers. Encrypted messages are stored in storage servers, while key servers hold shared secret keys that can perform partial decryption; a user can store cryptographic keys in the key servers, which operate under strong security mechanisms. The number of storage servers is much larger than the number of message blocks, ensuring robustness of the data. Erasure coding allows the system to encode encrypted messages: a decentralized erasure code stores the data as codeword symbols spread across different storage servers. This method can be used as an alternative to storing replicas of message objects on different storage servers of the cloud.
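A toy (3, 2) example shows the idea behind erasure-coded storage: two message blocks become three codeword symbols on three storage servers, and any two surviving servers suffice to decode, at lower overhead than full replication. Real systems use codes over larger fields (e.g., Reed-Solomon); the XOR parity and function names here are illustrative assumptions:

```python
def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(block_a, block_b):
    """Spread two equal-length message blocks over three storage servers."""
    return [block_a, block_b, xor(block_a, block_b)]   # third symbol is parity

def decode(symbols):
    """symbols: dict server_index -> stored bytes; any two servers suffice."""
    if 0 in symbols and 1 in symbols:
        return symbols[0], symbols[1]
    if 0 in symbols:                       # server 1 lost: b = a XOR parity
        return symbols[0], xor(symbols[0], symbols[2])
    return xor(symbols[1], symbols[2]), symbols[1]     # server 0 lost
```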
Full Paper

IJCST/42/4/B-1583
165 A Novel Approach for Attribute Based Encryption in Cloud Computing

S.Sailaja, Sayyed Nagul Meera

Abstract

In an Attribute-Based Encryption (ABE) scheme, attributes play a very important role. Attributes are exploited to generate a public key for encrypting data and are used as an access policy to control users' access. The access policy can be categorized as either key-policy or ciphertext-policy. In this paper, we describe a CP-ABE based encryption scheme that provides fine-grained access control. In a CP-ABE scheme, each user is associated with a set of attributes based on which the user's private key is generated. Contents are encrypted under an access policy such that only those users whose attributes match the access policy are able to decrypt.
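The ciphertext-policy check, deciding whether a user's attribute set satisfies the access policy embedded in a ciphertext, can be sketched as evaluation of an AND/OR tree. The cryptographic enforcement itself (secret sharing over the same tree) is omitted, and the attribute names are invented for illustration:

```python
def satisfies(policy, attributes):
    """policy: an attribute string, or a nested ('AND', ...)/('OR', ...)
    tuple over such strings; attributes: the set bound to a user's key."""
    if isinstance(policy, str):
        return policy in attributes
    op, *children = policy
    results = [satisfies(c, attributes) for c in children]
    return all(results) if op == "AND" else any(results)
```

A user whose private key carries {doctor, cardiology} can decrypt content under the policy ("AND", "doctor", ("OR", "cardiology", "radiology")), while a user with only {doctor} cannot.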
Full Paper

IJCST/42/4/B-1584
166 A Novel Approach for Group Key Management Scheme in Ad-Hoc Mobile Network

Mohammed Gulzar C, S. Jayakanth

Abstract

In Mobile Ad-Hoc Networks (MANETs), most current key-management schemes can efficiently handle only a limited number of nodes; as the number of nodes grows, they become either more inefficient or more insecure. How to develop key-management schemes that efficiently and securely accommodate the dynamic nature of MANETs is therefore a crucial issue. In this paper, we propose a Simple and Efficient Group Key (SEGK) management scheme for MANETs, in which group members compute the group key in a distributed manner.
Full Paper

IJCST/42/4/B-1585
167 Protected and Best-Organized System Under Wireless Sensor Networks

Y.Harika, Balusa Anil Kumar

Abstract

Most large-scale sensor networks are expected to follow a two-tier architecture, with resource-poor sensor nodes at the lower tier and resource-rich master nodes at the upper tier. Master nodes collect data from sensor nodes and then answer queries from the network owner on their behalf. In hostile environments, master nodes may be compromised by the adversary and instructed to return fake and/or incomplete data in response to queries. Such application-level attacks are more harmful and harder to detect than blind DoS attacks on network communications, especially when the query results are the basis for critical decisions such as military actions. In this paper, we propose SafeQ (Secured and Efficient Query Processing), a protocol that prevents attackers from gaining information from either the sensor-collected data or the sink-issued queries. SafeQ also allows a sink to detect compromised storage nodes when they misbehave. To preserve privacy, SafeQ uses a novel technique to encode both data and queries such that a storage node can correctly process encoded queries over encoded data without knowing their values. To preserve integrity, we propose two schemes, one using Merkle hash trees and another using a new data structure called neighborhood chains, to generate integrity-verification information, so that a sink can use this information to verify whether the result of a query contains exactly the data items that satisfy the query. To improve performance and security, we propose using the Cramer–Shoup technique in the communication between sensors and storage nodes.
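The neighborhood-chain idea can be sketched as follows: each stored item carries the hash of its left neighbour in sorted order, so a query result from which an interior item has been silently dropped no longer links up. This is a simplified illustration under that reading; the real protocol also binds the range boundaries, which is omitted here:

```python
import hashlib

def h(x):
    return hashlib.sha256(x).hexdigest()[:16]

def chain(sorted_items):
    """Pair each item with the hash of its left neighbour
    (a sentinel stands in for the leftmost item's neighbour)."""
    out, prev = [], b"-INF"
    for item in sorted_items:
        out.append((item, h(prev)))
        prev = item
    return out

def links_intact(result):
    """True iff consecutive returned items really were stored neighbours;
    a dropped interior item breaks the hash chain."""
    prev_item = None
    for item, tag in result:
        if prev_item is not None and tag != h(prev_item):
            return False
        prev_item = item
    return True
```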
Full Paper

IJCST/42/4/B-1586
168 Proficiently Organizing the Protection & Integrity of Users Data in a Cloud Computing

Dinari Ravi Kumar, Goda.Srinivasarao

Abstract

Cloud computing requires companies and individuals to transfer some or all control of computing resources to Cloud Service Providers (CSPs). Such transfers naturally raise concerns for company decision makers. A 2010 survey of potential cloud customers by Fujitsu Research Institute [1] found that 88% of potential cloud consumers are worried about who has access to their data and demand more awareness of what goes on in the back-end physical servers. To address these concerns, we propose a novel Cloud Information Accountability (CIA) framework, based on the notion of information accountability; in addition, the ARIES algorithm is used to recover logging information.
Full Paper

IJCST/42/4/B-1587
169 Proficiently Keep Up Integrity and Correctness of Storage in Cloud Computing

Sreenadh Reddy, Goda.Srinivasarao

Abstract

Cloud computing has gained a lot of hype in the current world of IT and is said to be the next big thing in the computer world after the Internet. Cloud computing is the use of the Internet for tasks performed on the computer, and it is visualized as the next-generation architecture of the IT enterprise; the 'Cloud' represents the Internet. Cloud computing is related to several technologies, and the convergence of these technologies has come to be called cloud computing. In comparison to conventional approaches, cloud computing moves application software and databases to large data centers, where the data and services may not be fully trustworthy. In this article, I focus on secure data storage in the cloud, an important aspect of quality of service. To ensure the correctness of users' data in the cloud, I propose an effectual and adaptable scheme with salient qualities: it achieves data-storage correctness, allows only authenticated users to access the data, and supports data-error localization, i.e., the identification of misbehaving servers.
Full Paper

IJCST/42/4/B-1588
170 Detecting Intruders in Mobile Ad-Hoc Networks

P.Ravi Kumar, S. Kanthi Kiran

Abstract

Mobile ad hoc networking has become an exciting and important technology in recent years because of the rapid proliferation of wireless devices. It is, however, highly vulnerable to attacks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of a centralized monitoring and management point. The security of data becomes more important with the increased use of commercial applications over wireless network environments. Wireless networks face several security problems arising from different types of attacks and intruders. Among the many security attacks on MANETs is DDoS (Distributed Denial of Service). Our main aim is to observe the effect of DDoS on routing load, packet drop rate and end-to-end delay, all of which increase under attack. In this paper, we focus on the routing vulnerability of mobile ad hoc networks and analyze network performance under attack. Resistive schemes against these attacks are proposed for the Ad hoc On-demand Distance Vector (AODV) routing protocol, and the effectiveness of the schemes is validated. In the proposed system, the MD5 algorithm is used to provide security against attacks and to detect intruders.
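One plausible reading of the MD5-based protection is a keyed digest attached to each routing message, so a malicious intermediate node cannot alter the message without detection. The sketch below uses HMAC with MD5 only because the abstract names MD5 (it is cryptographically weak, and stronger hashes are preferable in practice); the message format and function names are invented:

```python
import hmac
import hashlib

def sign_route_msg(shared_key, msg):
    """Attach a keyed MD5 digest to a routing message."""
    tag = hmac.new(shared_key, msg, hashlib.md5).hexdigest()
    return msg, tag

def verify_route_msg(shared_key, msg, tag):
    """Recompute the digest and compare in constant time."""
    expected = hmac.new(shared_key, msg, hashlib.md5).hexdigest()
    return hmac.compare_digest(tag, expected)
```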
Full Paper

IJCST/42/4/B-1589
171 Annotation Based Fast Navigation of Web-Data Retrieval

Amit Kumar Yadav, Roshni Dubey

Abstract

Annotation of web pages is an area of research that is receiving a great deal of attention, as the number of websites, both on specific topics and as a whole, is increasing very fast. All of these databases are accessible over the web through HTML representations, and data extraction over the web is becoming more and more dynamic. Such data is huge and serves applications such as online shopping comparison and article collection. Annotating this collected information brings several advantages, including faster decision making, visiting only relevant information, reduced time wasted on futile searches, historical data management, and elimination of older searches. This paper provides an insight into annotation techniques and applies a few of them to deliver the required results with the above-stated advantages. Work by various researchers in the field of annotating data has largely been limited to a small set of tokens, with a focus on creating dynamic annotations only. This work proposes to apply dynamic annotations to website data, with tokenization performed using all sorts of tokens, including long text with no specific tokens. For machine learning and training, frequency-based annotators, common-knowledge annotators and schema-value annotators are applied, which facilitate a correct annotation process. For annotation, website pages are examined for content type, presentation style, data type, tag path and adjacency of contents.
Full Paper

IJCST/42/4/B-1590
172 Effect of Feedforward Back Propagation Neural Network for Breast Tumor Classification

Rajeshwar Dass, Sanjeet

Abstract

Nowadays, breast cancer is the most common disease found in women. Mammography is preferred for the early detection of tumors inside the breast. This paper presents an approach based on a feedforward back-propagation neural network (FFBNN) for breast tumor classification. Statistical texture features are extracted from mammograms, and suitable features are selected and used to train the FFBNN. The fully trained network, with different numbers of neurons in the hidden layer, is tested with unknown inputs, and the performance of the FFBNN method is evaluated in terms of accuracy, specificity, sensitivity, and precision for the classification of breast tumors.
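A minimal FFBNN sketch: one hidden sigmoid layer trained by back-propagation on squared error. The layer sizes, learning rate and toy training task below are illustrative assumptions, not the mammogram feature set or network configuration used in the paper:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class FFBNN:
    """One-hidden-layer feedforward network with back-propagation."""

    def __init__(self, n_in, n_hidden, seed=0):
        r = random.Random(seed)
        self.w1 = [[r.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [r.uniform(-1, 1) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * hi for w, hi in zip(self.w2, self.h)) + self.b2)
        return self.y

    def train_step(self, x, target, lr=0.5):
        y = self.forward(x)
        d_out = (y - target) * y * (1 - y)              # output-layer delta
        for j, hj in enumerate(self.h):
            d_h = d_out * self.w2[j] * hj * (1 - hj)    # hidden-layer delta
            self.w2[j] -= lr * d_out * hj
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
            self.b1[j] -= lr * d_h
        self.b2 -= lr * d_out
        return (y - target) ** 2
```

In the paper's setting, `x` would be the selected statistical texture features of a mammogram and the output a benign/malignant score.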
Full Paper

IJCST/42/4/B-1591
173 Identifying Customers’ Preference of Trust Factors in Adoption of B2C E-Commerce in India

Baljeet Kaur, Sushila Madan

Abstract

Internet penetration has changed the traditional ways of doing business. Globally, there is a steep rise in the number of people buying over the Internet, i.e., doing e-commerce. E-commerce has great potential in India, especially in an era of busy lifestyles, scarcity of time, bad traffic jams and attractive offers online. With deeper Internet and 3G penetration, rising living standards, surplus income, better deals online and the cash-on-delivery option, Indians have every reason to shop online. But there is another side to the story as well. Despite adequate reasons to shop online, many Indians still do not trust e-vendors. They are apprehensive about the security and privacy of their information, product quality, credit-card fraud, product delivery, the availability of returns or exchanges, and the authenticity of products. The significance of consumers' trust in e-commerce therefore cannot be overlooked. This paper provides an overview of customers' preferences among the trust factors in the Indian e-commerce market space. The study attempts to help e-vendors better understand customers' expectations and enhance their commercial websites in order to boost sales; it also gives customers finer insight so that they can shop online more advantageously. The research is based on a survey of experts in the Indian e-commerce market, and the AHP technique was used to prioritise the trust factors gathered from these experts through questionnaires.
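The AHP prioritisation step can be sketched with the row-geometric-mean approximation to the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix in the test is invented for illustration (e.g., "security moderately preferred to delivery"), not the study's data:

```python
def prod(xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

def ahp_priorities(matrix):
    """matrix[i][j] = how strongly factor i is preferred over factor j
    (Saaty's 1-9 scale, with matrix[j][i] = 1/matrix[i][j]).
    Returns normalised priority weights via the row geometric mean."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    s = sum(gm)
    return [g / s for g in gm]
```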
Full Paper

IJCST/42/4/B-1592
174 Achieving Usability Quality Components in Online Information Systems

Ravi Kumar Sachdeva, Dr. Sawtantar Singh, Dr. Jai Prakash

Abstract

Usability of an online information system greatly affects the success of the system. Achieving the quality components of usability, i.e., learnability, efficiency, memorability, errors and subjective satisfaction, is very important for the success of any system. This paper discusses the various possible ways to achieve these quality components in an online information system.
Full Paper

IJCST/42/4/B-1593