
International Journal of Computer Science and Technology
Vol. 6, Issue 2, Ver. 1 (April to June, 2015)

S.No. Research Topic Paper ID Download
01

Proposed System of Biometric Authentication Using Palm Print/Veins with Tsallis Entropy

Mohamed A. El-Sayed

Abstract
Identity verification is a general task with many real-life applications, such as access control, transaction authentication (in telephone banking or remote credit-card purchases, for instance), voice mail, and security. Most biometric applications are related to security and are used extensively for military and other government purposes. The goal of an automatic identity verification system is to either accept or reject the identity claim made by a given person. Biometric authentication technology is used to solve these problems: it identifies people by their unique biological features. However, a single biometric is not sufficient to meet the variety of requirements, including the matching performance imposed by several large-scale authentication systems. Multimodal biometric systems seek to alleviate some of the drawbacks encountered by uni-modal biometric systems by consolidating the evidence presented by multiple biometric traits/sources. This paper proposes a multi-modal authentication technique based on palm print using Tsallis entropy, augmented by palm-vein features to increase the accuracy of security recognition. The obtained results point to increased authentication accuracy.
Full Paper
IJCST/62/1/A-0444
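The abstract above does not reproduce the paper's feature-extraction details; as a reminder of the definition it builds on, a minimal sketch of computing Tsallis entropy from a gray-level histogram follows (the entropic index q and the 256-bin histogram are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def tsallis_entropy(p, q=0.5):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete
    probability distribution p; it reduces to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # ignore zero-probability bins
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def image_histogram_probabilities(gray_image, bins=256):
    """Normalized gray-level histogram of a 2-D integer image in
    [0, bins-1], returned as a probability vector."""
    hist, _ = np.histogram(gray_image, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

A palm-print region would typically be converted to such a probability vector first, and the resulting entropy value used as a texture feature.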
02

Visual Studio Professional: A Benchmark in Bioinformatics

Devadrita Dey Sarkar

Abstract
This paper presents the basic idea behind Visual Studio 4.0 as an essential tool in the field of bioinformatics. It offers a guide to using the Visual C# .NET language to develop stand-alone applications for Microsoft Windows and Web-enabled Microsoft .NET applications. It covers the language’s structure, syntax, code wizards, and the Microsoft Visual Studio design environment, paying close attention to both the client and server sides of the .NET environment. Readers will find detailed answers and best practices that help them write, test, and debug applications and extend them to the Web quickly and intuitively, as well as explore various aspects of biology. An extensive collection of real-world programming examples demonstrates solutions to specific coding problems.
Full Paper
IJCST/62/1/A-0445
03

Using Side Information Partitioning for Efficient Classification of Web Data

Nazia Khan, Brajesh Patel

Abstract
Web data mining has become a major requirement in the current era, as the spread of the internet has greatly increased the data available online. User requirements are becoming more demanding: websites must be search-engine friendly and listed at the top of search results. But since the available search-engine techniques are generic, they are not very efficient and can produce false information. The internet contains an enormous amount of data, and normal SEO techniques mislead users toward irrelevant search results. Researchers suggest that the most relevant content is enclosed and identified in the side information within the contents. Side information includes headings, alternate text, meta text, bold and strong elements, etc.
Full Paper
IJCST/62/1/A-0446
04

Reviewing Cloud Load Balancing Architectural Challenges and Methods

Hemant Petwal, Narayan Chaturvedi

Abstract
Network bandwidth and hardware capabilities are growing quickly, driving the rapid progress of the internet. A new computing form called cloud computing uses low-power hosts to achieve improved reliability. Cloud computing is an Internet-based development in which highly scalable and virtualized resources are provided as a service over the Internet. It uses a class of systems and applications that employ distributed resources to deliver services in a decentralized manner, and it uses the computing resources available over the web to enable the completion of complex tasks that need large-scale computation. Thus, the nodes selected to execute a task in cloud computing have to be trusted, and to exploit the effectiveness of the resources, they have to be properly selected according to the properties of the task.
Full Paper
IJCST/62/1/A-0447
05

Ranking Popular Itemset

N. Rajender, V. Raju, S. Rajesh

Abstract
The problem of ranking popular items is attracting increasing interest from a number of research areas, and several algorithms have been proposed for this task. The problem of ranking and suggesting items arises in diverse applications, including interactive computational systems that help people leverage social information; technically, these systems are called social navigation systems. It also arises in search-query suggestion and tag suggestion for social tagging systems. Several algorithms for ranking and suggesting popular items are studied and proposed, analytical results on their performance are provided, and numerical results are presented using the popularity of tags inferred from a month-long crawl of a popular social bookmarking service. The main goal of this paper is to quickly learn the true popularity ranking of items and suggest truly popular items.
Full Paper
IJCST/62/1/A-0448
06

Dimensionality Reduction Problems Targeting Private Cloud Environments

Sangeetha Balodhi, Dr. Bhasker Pant

Abstract
Cloud data such as fMRI and multidimensional images usually have very high dimensionality. Such high-dimensional datasets present many mathematical challenges as well as many opportunities. For cloud environments to process the data, reduced representations must have a lower dimensionality while still representing the data correctly. The mathematical procedures that make this reduction possible are called dimensionality reduction techniques; they have been widely developed in fields such as statistics and machine learning. However, very few procedures are available that can efficiently and accurately map the higher dimensions of such datasets over clouds.
Full Paper
IJCST/62/1/A-0449
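As context for the reduction techniques this abstract refers to, a minimal sketch of one standard such technique, principal component analysis via SVD, is shown below (an illustration of the general idea, not necessarily the specific procedure the paper targets):

```python
import numpy as np

def pca_reduce(X, k):
    """Project an n x d data matrix X onto its top-k principal
    components, a standard dimensionality-reduction step."""
    Xc = X - X.mean(axis=0)                       # center each feature
    # SVD of the centered data: rows of Vt are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                          # n x k reduced data
```

For cloud-scale data the same projection is usually computed with randomized or distributed SVD variants rather than a dense in-memory decomposition.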
07

Survey of Smartphone Based Cloud Computing Architecture Supporting Mobile Client Accessing Web Services

Kanika Mehta, Deepak Bagga

Abstract
Cloud computing on smartphones is the combination of smartphones and cloud-computing-based web services. It provides access to information and applications without the need for complex and costly hardware and software. This paper discusses architectures of cloud computing and the challenges that arise in them. Consuming a web service from a smartphone differs from standard web-service scenarios due to the following factors: mobile devices have limited resources in terms of CPU power and screen size; communication on a smartphone is established through a wireless network; and existing services in the cloud are not always supported on a smartphone. A service-oriented architecture is essentially a collection of services which communicate with each other.
Full Paper
IJCST/62/1/A-0450
08

Image Contrast Enhancement Techniques Based on Dynamic Range of Histogram

Dheeraj S. Patil, D. Y. Loni

Abstract
Image contrast enhancement plays a very important role in digital image processing, and Histogram Equalization (HE) is the technique most commonly used for it. In this paper two different approaches to image contrast enhancement are implemented: the first method is called DRSHE and the second is adaptive histogram equalization. DRSHE divides the dynamic range of the histogram into k parts, then resizes the grayscale range depending on the area ratio, and finally redistributes histogram intensities uniformly in the resized grayscale range. The method uses the Weighted Average of Absolute color Difference (WAAD) to emphasize the edges of the original image, and a linear adaptive scale factor to control excessive changes in brightness. The results show that DRSHE retains the naturalness of the original image compared with conventional contrast-enhancement methods. Both techniques are also applied to video: the frames of a video are extracted, each frame is contrast-enhanced, and the enhanced frames are recombined to obtain the output video.
Full Paper
IJCST/62/1/A-0451
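The exact DRSHE resizing rule is not reproduced in the abstract; the sketch below only illustrates the idea it describes — split the dynamic range into k segments, allocate each segment a share of the output range proportional to its pixel count (area ratio), and remap intensities linearly within each segment. Segment boundaries and the linear mapping are assumptions for illustration:

```python
import numpy as np

def drshe_sketch(gray, k=4, levels=256):
    """Simplified DRSHE-style remapping of a grayscale image:
    k equal input segments, output widths proportional to each
    segment's pixel count, linear stretch inside each segment."""
    hist, _ = np.histogram(gray, bins=levels, range=(0, levels))
    seg = levels // k
    areas = np.array([hist[i*seg:(i+1)*seg].sum() for i in range(k)],
                     dtype=float)
    shares = areas / areas.sum() * (levels - 1)       # output widths
    starts = np.concatenate(([0.0], np.cumsum(shares)[:-1]))
    out = np.empty_like(gray, dtype=float)
    for i in range(k):
        lo, hi = i * seg, (i + 1) * seg - 1
        mask = (gray >= lo) & (gray <= hi)
        # linear map [lo, hi] -> [starts[i], starts[i] + shares[i]]
        out[mask] = starts[i] + (gray[mask] - lo) / max(hi - lo, 1) * shares[i]
    return out.astype(np.uint8)
```

The full method additionally applies the WAAD edge weighting and the adaptive scale factor mentioned above, which this sketch omits.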
09

Anomaly Detection for Intelligent Video Surveillance: A Survey

G. Gayathri, S. Giriprasad

Abstract
Video anomaly detection plays a critical role in intelligent video surveillance. For abnormal-event detection, both spatial and temporal content is considered. Using spatio-temporal video segmentation, a new region-based descriptor called motion context is formed. The basic patch descriptor groups neighbourhood pairs, and dynamic patch grouping merges similar frames. Datasets are then prepared and searched for the best match, a dynamic threshold is determined, and the datasets are compared with the original pattern. RGB colour variation can be used to improve image quality.
Full Paper
IJCST/62/1/A-0452
10

Virtualization and Consolidation of Server and Related Security Issues

Arpit Chhabra, Manav Bansal

Abstract
Although ‘virtualization’ as a concept is not new, currently, ‘server virtualization’ has become a hot topic in the IT industry and everyone is talking about it. It is deeply connected with the topic of ‘green IT computing’ (other synonymous terms being green computing and green IT).
Many organizations seem to be rushing to adopt server virtualization in their attempt to jump onto the ‘go-green’ wagon. It is, therefore, important to be familiar with such a current and hot topic. When ‘server virtualization’ is mentioned, the mention of ‘blade servers’ almost invariably comes up.
Full Paper
IJCST/62/1/A-0453
11

An Approach on Semi Distributed Grouping for Cloud Computing Systems

Lavanaya S.

Abstract
Cloud computing is the provision of networked, online, on-demand services paid for on a per-use basis. Several issues, such as security, scalability, and performance, have been discussed by many researchers in cloud computing. Cloud partitioning is an optimal method for the public cloud. In a public cloud environment, numerous nodes with essential computing resources are situated in discrete geographic locations; this approach simplifies load distribution across the several nodes, but fault tolerance and load balancing remain the most essential problems in obtaining high performance. Load balancing is the process of distributing workload between different nodes. The intention of load balancing is to improve the performance of a cloud environment through an appropriate distribution strategy.
Full Paper
IJCST/62/1/A-0454
12

An Analysis of Macedonia’s Position in the World Ranking in Years for E-government and the Development of an (Online) Application for the Submission of Corruption

Burhan RAHMANI, Florim IDRIZI

Abstract
In today’s world, where every activity is conducted through technological equipment and the internet, it is crucial for a country to be transparent toward its citizens and for every service to be available online. Currently, Macedonia does not hold the best world ranking for e-government despite many attempts to improve this field. In this paper we strive to analyze the position of Macedonia in the world ranking for e-government in recent years, and we present one option by developing a web application for the submission of corruption reports concerning governmental institutions.
Full Paper
IJCST/62/1/A-0455
13

Enhanced QOD Routing Protocol With Link Aware Opportunistic Relay Node Selection Algorithm

Padmaja P., B. S. Venkata Reddy

Abstract
As wireless communication gains popularity, significant research has been devoted to supporting real-time transmission with stringent Quality of Service (QoS) requirements for wireless applications. At the same time, a wireless hybrid network that integrates a mobile wireless ad hoc network (MANET) and a wireless infrastructure network has proven to be a better alternative for next-generation wireless networks. Directly adopting resource-reservation-based QoS routing for MANETs subjects hybrid networks to the invalid-reservation and race-condition problems of MANETs, and how to guarantee QoS in hybrid networks remains an open problem. We therefore propose a QoS-Oriented Distributed routing protocol (QOD) to enhance the QoS support capability of hybrid networks. Taking advantage of fewer transmission hops and the anycast transmission features of hybrid networks, QOD transforms the packet routing problem into a resource scheduling problem. QOD incorporates five algorithms: (1) a QoS-guaranteed neighbor selection algorithm to meet the transmission delay requirement, (2) a distributed packet scheduling algorithm to further reduce transmission delay, (3) a mobility-based segment resizing algorithm that adaptively adjusts segment size according to node mobility in order to reduce transmission time, (4) a traffic redundancy elimination algorithm to increase the transmission throughput, and (5) a data redundancy elimination-based transmission algorithm to eliminate redundant data and further improve the transmission QoS.
Full Paper
IJCST/62/1/A-0456
14

Social Learning in Stock Market: Prediction Model

Dudhat Ankit Kumar M, Prof. R. R. Badre, Prof. Mayura Kinikar

Abstract
Business and financial news bring us the latest information about the stock market. Studies have shown that business and financial news have a strong correlation with future stock performance. Therefore, extracting sentiments and opinions from business and financial news is useful as it may assist in the stock price predictions. In this paper, we present a sentiment analyser for financial news articles using lexicon-based approach. We use polarity lexicon to identify the positive or negative polarity of each term in the corpus.
Full Paper
IJCST/62/1/A-0457
15

The Protected Recognition Foray Schema for Manet’s – EAACK

Rajesh Pilla, Reddi Prasadu

Abstract
Wireless networks are used enormously because of their mobility and scalability. Of all the available wireless networks, the Mobile Ad-hoc NETwork (MANET) is the most important and typical application. A MANET does not require a fixed network infrastructure and has a changing topology; every single node works as both a transmitter and a receiver, and node configuration is done on its own. Nodes communicate among themselves either directly or with the help of neighbors. The open medium leaves MANETs vulnerable to attacks. The existing system imposes the Enhanced Adaptive ACKnowledgment (EAACK) method, in which a digital-signature scheme is used that causes network overhead. The proposed system therefore specifies a hybrid cryptography technique to reduce this network overhead.
Full Paper
IJCST/62/1/A-0458
16

Energy Efficient Naive Bayes Prediction Model for Data Reduction in WSN

Thaker Maulik B., Prof. Uma Nagaraj, Prof. Pramod D. Ganjewar

Abstract
A Wireless Sensor Network (WSN) is an emerging technology that serves many applications, such as environmental monitoring, health systems, military applications, and agriculture. Transmitting huge amounts of data is a serious issue in WSNs, and energy consumption is the main limitation when transmitting data from a source node to the sink node; we therefore introduce prediction-based data reduction in WSNs. In this paper, we propose a framework named the NBP (Naive Bayes Prediction) model for WSNs to reduce the transmitted data as well as save the energy of the network.
Full Paper
IJCST/62/1/A-0459
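The abstract does not spell out how prediction reduces transmissions. The generic dual-prediction loop behind such schemes can be sketched as follows; note that this sketch uses a trivial last-value predictor for illustration, not the paper's Naive Bayes model, and the threshold is an assumed parameter:

```python
def reduce_transmissions(readings, threshold=0.5):
    """Dual-prediction data reduction: sensor and sink run the same
    predictor (here, the last transmitted value); the sensor transmits
    only when |reading - prediction| exceeds the threshold.
    Returns (transmitted samples, the sink's reconstructed series)."""
    sent, reconstructed = [], []
    prediction = None
    for i, r in enumerate(readings):
        if prediction is None or abs(r - prediction) > threshold:
            sent.append((i, r))      # transmit: predictor missed
            prediction = r           # both sides update the predictor
        reconstructed.append(prediction)
    return sent, reconstructed
```

Swapping in a better predictor (such as the paper's Naive Bayes model) lowers the miss rate and hence the number of radio transmissions, which is where the energy saving comes from.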
17

Combination Methodology for Protected Formal Deduplication in Cloud Computing

Venkateswara Rao Akkireddy, K.S.N.Murthy

Abstract
Data deduplication is an important method for eliminating redundant data: instead of keeping multiple copies of files, it stores only one distinct copy of each. Across many organizations, storage systems contain numerous pieces of duplicate data; for example, different users store similar files in several different places. Deduplication abolishes these additional copies by saving a single copy of the data and replacing the other copies with pointers that lead back to the original. It is a data compression technique for increasing bandwidth efficiency and storage utilization, and it is used extensively in cloud computing nowadays.
Full Paper
IJCST/62/1/A-0460
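The single-copy-plus-pointers scheme described above is commonly implemented as a content-addressed store. A minimal sketch (the class and method names are illustrative, and a real cloud system would deduplicate at chunk rather than whole-file granularity):

```python
import hashlib

class DedupStore:
    """Content-addressed store: each distinct blob is kept once,
    keyed by its SHA-256 digest; files are just pointers (digests)."""
    def __init__(self):
        self.blobs = {}    # digest -> data (single physical copy)
        self.files = {}    # filename -> digest (pointer)

    def put(self, name, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)   # store only if new
        self.files[name] = digest

    def get(self, name) -> bytes:
        return self.blobs[self.files[name]]
```

Two users uploading identical files thus consume the space of one copy, which is the bandwidth and storage saving the abstract refers to.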
18

Review of Cloud Security for E-Learning System

Piyushika, Prof. Kedar Nath Singh

Abstract
Cloud computing is spreading around the world and drawing researchers’ focus. Two issues stand out: first, making it possible for two or more clouds to communicate, and second, the security of that communication. With the emergence of cloud computing, the term “hybrid deployment” is becoming more and more common; a hybrid deployment joins different cloud deployments into one connected cluster. Hybrid cloud computing mainly deals with the working of data centers where different software is installed, with hugely growing data, to provide information to the users of the system. One technique that can be used in hybrid cloud security is to share a challenge text between the clouds, for authentication, before actual communication starts. The various works done in this area to date are oriented toward other techniques for security between two or more clouds in a hybrid cloud.
Full Paper
IJCST/62/1/A-0461
19

Image Retrieval Using Bilinear Similarities for Large Datasets

Ch. Pradeep Kumar, Dr. Gorti Satyanarayana Murty

Abstract
This is an image retrieval approach that retrieves images from a large dataset using an image as the query, selecting all of the query’s variants that are most relevant to it. It makes use of a single bilinear similarity measure for image retrieval. Content-based image retrieval (CBIR) extracts images according to their features, since frequently retrieving the required images from a large image database is a hard problem. Users are never fully satisfied with present technologies and always look forward to further improvement in the image retrieval process. CBIR focuses on image features and retrieves from the database the images that are semantically correct with respect to the query image.
Full Paper
IJCST/62/1/A-0462
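The bilinear similarity the abstract names has a compact standard form, s_W(x, y) = xᵀWy, where W is a learned weight matrix. A minimal sketch (the ranking helper and the identity-matrix example are illustrative; the paper's learning procedure for W is not reproduced here):

```python
import numpy as np

def bilinear_similarity(x, y, W):
    """Bilinear similarity s_W(x, y) = x^T W y between feature vectors
    x and y; W is a d x d weight matrix (with W = I this reduces to
    the plain dot product)."""
    return float(x @ W @ y)

def rank_images(query, gallery, W):
    """Return gallery indices sorted by decreasing similarity to the query."""
    scores = [bilinear_similarity(query, g, W) for g in gallery]
    return sorted(range(len(gallery)), key=lambda i: -scores[i])
```

Retrieval then amounts to scoring every gallery image against the query vector and returning the top-ranked indices.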
20

Automatic Plant Leaf Classification on Mobile Field Guide

Snehal Shejwal, Prachi Nikale, Ashwini Datir, Ashvini Kadus, Pratik Bhade, Rahul Pawar

Abstract
Various plant species exist on earth, but it is very difficult to identify a plant and its usage. A plant can be verified by extracting the features of its leaves. Digital image processing is the necessary tool for feature extraction and for obtaining the pattern of the studied images. Several techniques are available to perform this task, but it is very difficult to find a method that gives accurate and efficient results. To resolve this problem, our project aims to present an application for classifying plants on a smartphone. Classifying a leaf based on its extracted features is called plant leaf classification. The techniques used for leaf classification are k-nearest neighbor (k-NN), Probabilistic Neural Network (PNN), Support Vector Machine, and fuzzy logic. The goal of this project is to provide the fastest application that also gives accuracy. SQLite, an open-source database embedded into Android, is used; it supports standard relational database features like SQL syntax, transactions, and prepared statements.
Full Paper
IJCST/62/1/A-0463
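Of the classifiers the abstract lists, k-NN is the simplest to state. A minimal sketch of classifying a leaf-feature vector by majority vote among its nearest training samples (the feature vectors and species labels below are made up for illustration):

```python
import numpy as np
from collections import Counter

def knn_classify(features, train_X, train_y, k=3):
    """Classify a leaf-feature vector by majority vote among the
    k nearest training samples under Euclidean distance."""
    dists = np.linalg.norm(np.asarray(train_X, dtype=float)
                           - np.asarray(features, dtype=float), axis=1)
    nearest = np.argsort(dists)[:k]          # indices of k closest leaves
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

On a phone, the training features and labels would live in the SQLite database mentioned above and be loaded before classification.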
21

Recent Issues and Challenges on Big Data in Cloud Computing

Dr. Jangala Sasi Kiran, M. Sravanthi, K. Preethi, M. Anusha

Abstract
We live in an on-demand, on-command digital universe, with data rapidly reproduced by institutions, individuals, and tools at a very high rate. This data is categorized as “Big Data” due to its sheer Volume, Variety, Velocity, and Veracity. Most of the data is partly structured, unstructured, or semi-structured, and it is heterogeneous in nature. Due to this specific nature, Big Data is stored in distributed file-system architectures; Hadoop and HDFS by Apache are widely used for storing and managing it. Analyzing Big Data is a challenging task, as it involves large distributed file systems which should be fault tolerant, flexible, and scalable. Cloud computing plays a very vital role in protecting the data, applications, and the related infrastructure with the help of policies, new technologies, controls, and big data tools. Moreover, cloud computing, applications of Big Data, and their advantages are likely to represent the most promising new frontiers in science. Technology issues such as storage and data transport seem solvable in the near term, but they represent long-term challenges that require research and new paradigms. Analyzing the issues and challenges comes first as we begin a collaborative research program into methodologies for big data analysis and design.
Full Paper
IJCST/62/1/A-0464
22

Self Adaptive Utility-Based Routing Protocol (SAURP)

Rakhi S. Belokar

Abstract
This report introduces a novel multi-copy routing protocol, called the Self Adaptive Utility-based Routing Protocol (SAURP), for Delay Tolerant Networks (DTNs) that are possibly composed of a vast number of miniature devices, such as smartphones, of heterogeneous capacities in terms of energy resources and buffer space. SAURP is characterized by the ability of identifying potential opportunities for forwarding messages to their destinations via a novel utility-function-based mechanism, in which a suite of environment parameters, such as wireless channel condition, nodal buffer occupancy, and encounter statistics, are jointly considered. Thus, SAURP can reroute messages around nodes experiencing high buffer occupancy, wireless interference, and/or congestion, while requiring a considerably small number of transmissions. The utility function developed in SAURP is proved to achieve optimal performance, which is further analyzed via a stochastic modeling approach. Extensive simulations are conducted to verify the developed analytical model and compare the proposed SAURP with a number of recently reported encounter-based routing approaches in terms of delivery ratio, delivery delay, and the number of transmissions required for each message delivery. The simulation results show that SAURP outperforms all the counterpart multi-copy encounter-based routing protocols considered in the study.
Full Paper
IJCST/62/1/A-0465
23

Hybrid Approach for Translation of Common English Phrases to Punjabi

Aman Aggarwal, Rohit Sethi

Abstract
This paper concerns the general conversion of phrases from one language to another using translation techniques. Transliteration is the process of converting the alphabet of a source language into its approximate phonetic or spelling equivalents in the target language. In this case we have taken two different languages, Punjabi and English, and have worked on the conversion of common English phrases into Punjabi phrases. A further attempt converts Punjabi phrases into corresponding English spellings with the same pronunciation, so that whoever reads the English-written words or phrases is in effect speaking Punjabi. This makes it possible for people who know only English, and neither understand nor know Punjabi, to still convey a message in Punjabi without concern about its meaning being understood by other people.
Full Paper
IJCST/62/1/A-0466
24

A Survey on Challenges and Advantages in Big Data

Lenka Venkata Satyanarayana

Abstract
Big data refers to data sets that are too big to be handled using existing database-management tools; such data sets are emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data-analytics research, and exciting activities are addressing it. The central theme is to connect big data with people in various ways. In particular, this paper showcases recent progress in user-preference understanding, context-aware on-demand data mining using crowd intelligence, summarization and explorative analysis of large data sets, and privacy-preserving data sharing and analysis. The primary purpose of this paper is to provide an in-depth analysis of the different platforms available for performing big data analytics; it surveys the hardware platforms available and assesses their advantages and drawbacks.
Full Paper
IJCST/62/1/A-0467
25

SystemVerilog Based Verification Environment for Wishbone Interface of AHB-Wishbone Bridge

Bhankhar Dhvani, Samir Shroff

Abstract
SystemVerilog is the industry’s first unified Hardware Description and Verification Language (HDVL). It became an official IEEE standard (IEEE 1800™) in 2005 under the development of Accellera [1]. This is the first time that such constructs have been made available to both digital design and verification engineers. In this paper, a verification environment based on a constrained-random layered testbench written in SystemVerilog is implemented to verify the functionality of a DUT designed with the synthesizable constructs of SystemVerilog. These new verification constructs can be easily reused thanks to the object-oriented features of SystemVerilog. A uniform verification environment for the AHB2WB Bridge is developed using SystemVerilog after a comprehensive analysis of the verification plan. The proposed multi-layer testbench comprises a generator, bridge driver, agent, scoreboard, checker, test cases, and assertions, which are implemented with different features of SystemVerilog.
Full Paper
IJCST/62/1/A-0468