
International Journal of Computer Science and Technology
Vol. 7, Issue 1, Ver. 1 (Jan – Mar 2016)

S.No. Research Topic Paper ID Download
1

Routing Layer Design issues and Protocols for Wired and Wireless Networks

Chaganti B N Lakshmi, Dr. S.K.Mohan Rao

Abstract

A computer network is a collection of interconnected, independent computers that provides communication among different users. The software required to operate such a network is organized as a set of layers: the physical, data link, network, transport and application layers. Each layer has distinct functions to perform, and the protocols at these layers are collectively called the protocol stack. This paper focuses on the issues to be considered in the design of protocols at the network layer and gives an overall view of the existing routing protocols for wired and wireless networks.
Full Paper

IJCST/71/1/A-0623
2

Implementation of Experimental Test Bed to Evaluate Security in Cellular Networks

Nasibeh Mohammadzadeh, Mohsen Hallaj Asghar, Dr. N. Raghu Kisore

Abstract

The wide interconnectivity of cellular networks with the internet has made them vulnerable. This exposure of cellular networks to the internet has increased threats to customer-end equipment as well as to the carrier infrastructure. It is very difficult to work with the full-fledged infrastructure of a typical service provider network due to operational constraints. Nevertheless, it is very important for an analyst to evaluate the vulnerabilities and take corrective action. This paper introduces the modeling of a small cellular network (2G, 2.5G, 3G) with security weaknesses, done by using off-the-shelf hardware components to set up an experimental test bed. The vulnerability of such cellular networks is probed through the construction of a rogue GSM tower from off-the-shelf hardware components. This paper also discusses the test bed's GSM security concepts and the possibility of conducting a man-in-the-middle attack on the cellular network model.
Full Paper

IJCST/71/1/A-0624
3

The Accompanying Self-assurance Endorsement in Differentiated Web Search

V.V.Surya Sasank, S.Anuradha

Abstract

The volume of information searched on the World Wide Web is growing rapidly, and search engines must be able to retrieve information according to the user's preferences. Current web search engines are built to serve all users, independent of the special needs of any individual user. Personalized web search aims to tailor retrieval for each user by incorporating his or her interests: every user has a specific background and a specific goal when searching for information on the Web, so the goal of web search personalization is to tailor search results to a particular user based on that user's interests and preferences. However, effective personalized search requires collecting and aggregating user information, which often raises serious privacy concerns for some users. Indeed, these concerns have become one of the main barriers to deploying personalized search applications, and how to achieve privacy-preserving personalization is a great challenge; a balance must be struck between search quality and privacy protection. We therefore propose privacy protection for PWS applications that model user preferences as hierarchical user profiles, using a PWS framework called UPS that can adaptively generalize profiles per query while respecting user-specified privacy requirements. Alongside personalized search and privacy protection, a custom search facility is also provided so that users receive relevant information.
Full Paper

IJCST/71/1/A-0625
4

Facial Detection & Recognition Using OpenCV Library

Manav Bansal, Sohan Garg

Abstract

Interest in computer vision technologies has evolved over the last decade. Fuelled by the doubling of computing power roughly every 12 months, face detection and recognition have transcended from an outlandish idea to a popular area of research in computer vision and one of the more productive applications of image analysis and algorithm-based computation. Because of the intrinsic nature of the problem, computer vision is not only a computer science research area but also the object of neuroscientific and psychological studies, mainly because of the general opinion that advances in computer image processing and understanding will provide insights into how our brain works, and vice versa.
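Although the paper does not publish code, the classical OpenCV route it alludes to is a Haar cascade detector. A minimal sketch, assuming OpenCV's bundled frontal-face cascade; the image file names are illustrative:

```python
import cv2

# OpenCV ships pretrained Haar cascades under cv2.data.haarcascades.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("people.jpg")                 # illustrative input path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)   # cascades work on grayscale

# detectMultiScale returns one (x, y, w, h) box per detected face.
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces_detected.jpg", image)
print(f"Detected {len(faces)} face(s)")
```

Recognition (identifying whose face it is) would follow as a second stage, for example by training a recognizer on the cropped face regions.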
Full Paper

IJCST/71/1/A-0626
5

The Report Analysis and Characteristics of Behaviour Driven Development with Cucumber Model

Dr. Ravi Saripalle, Buddharaju Shanmukh Varma, Raghukanth Reddy Gudimetla, Sudeepa Gorle

Abstract

Behaviour Driven Development (BDD) has gained increasing attention as an agile development approach in recent years. However, characteristics that constitute the BDD approach are not clearly defined. In this paper, we present a set of main BDD characteristics identified through an analysis of relevant literature and current BDD toolkits. Our study can provide a basis for understanding BDD, as well as for extending the existing BDD toolkits or developing new ones.
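As a hedged illustration of the Given/When/Then characteristic these toolkits share, here is a minimal step-definition sketch using behave, a Python analogue of Cucumber; the feature text and step names are invented for this example:

```python
# steps/calculator_steps.py -- step definitions for a feature file such as:
#
#   Feature: Addition
#     Scenario: Add two numbers
#       Given I have entered 2 and 3
#       When I add them
#       Then the result should be 5
#
from behave import given, when, then

@given("I have entered {a:d} and {b:d}")
def step_enter_numbers(context, a, b):
    context.a, context.b = a, b          # context carries state across steps

@when("I add them")
def step_add(context):
    context.result = context.a + context.b

@then("the result should be {expected:d}")
def step_check_result(context, expected):
    assert context.result == expected
```

The executable specification (the feature file) stays readable by non-programmers, which is exactly the kind of BDD characteristic the paper's analysis identifies.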
Full Paper

IJCST/71/1/A-0627
6

Dynamic Semi-Group SFK Pattern on Complex Operating System for Optimizing the Risk

Padma Lochan Pradhan

Abstract

Nowadays the importance of business and resources hosted on a complex real-time system (RTS) is increasing, and growing external risk is a very common phenomenon. The system risks put forward to senior management focus on complex risk on the RTS. Senior management has to decide whether to accept expected losses or to invest in security mechanisms in order to minimize the downtime caused by risk on complex infrastructure. This paper contributes to the development of an optimization model that aims to determine the optimal cost to be invested in UFS mechanisms, i.e., the allocation and distribution of measure components over the operating system and the relevant resources (shell, file and kernel). Our SFK pattern should be designed in such a way that the file system, shell and kernel are automatically protected, detected and corrected at all times. We have to reduce system risk by implementing an SFK pattern based on a semi-group structure, meanwhile enforcing the highest access control on the file, memory, processor and kernel systems. Finally, we have to maximize the performance, reliability and fault tolerance of the RTOS over a complex web application while minimizing its cost and time. Our objective is to fix the risk at an optimal level with minimal cost and time.
Full Paper

IJCST/71/1/A-0628
7

Secure Your Cloud Using Part of Yourself: “Biometrics”

Dr. Nikita Yadav, Dr. Garima Yadav

Abstract

Cloud? Yes, cloud is the buzzword nowadays; everyone, everywhere is talking about it. Cloud computing has gained popularity very fast because of its remarkable advantages, some of which are cost efficiency, almost unlimited storage, backup and recovery, automatic software integration, quick deployment and easy access to information. Despite all these advantages, this technology still faces a major problem, or rather a challenge: security. Security is the biggest challenge, threat and question for every technology, faced all over the world, and at present biometrics is an answer to it. In this paper we use biometrics to make the cloud secure.
Full Paper

IJCST/71/1/A-0629
8

The Security Method Provocation for User Assigned Images on Content Sharing Website

K.Ramya Krishna, Smitha Rani Sahu

Abstract

These days people share many personal images on social networks, which requires maintaining privacy. Privacy is required to prevent the misuse of such images, and keeping these images secure requires various privacy settings. If a tool is provided that lets the user set privacy easily, it will reduce his effort; to address this need, several techniques have been proposed. In this paper some privacy recommendation techniques are discussed. These techniques recommend privacy settings for a user's images, using the user's profile information and the properties of the images; tags related to images and visual properties are also important for characterizing images. Social media has grown to be one of the most important parts of our daily life, as it enables us to communicate with many people. With the creation of social networking sites such as MySpace, LinkedIn and Facebook, people are offered opportunities to meet new people and friends in their own and in other communities across the world. Users of social networking services share an abundance of personal information with a large number of “friends.” This enhanced technology leads to privacy violations, as users share large volumes of images across large groups of people. This privacy must be taken care of in order to improve user satisfaction. The goal of this study is to give a comprehensive review of various privacy policy approaches to improve the security of information shared on social media sites.
Full Paper

IJCST/71/1/A-0630
9

Analysis of Different Brain Tumor Detection and Segmentation Techniques in MRI and other Medical Images

Saumya Gupta, Monika Agrawal, Sanjay Kumar Sharma

Abstract

The brain is the anterior-most part of the central nervous system. A tumor is caused by the formation of extra cells in the brain, as new cells build up while older or damaged cells persist for an unknown reason. Modern medical imaging research faces the challenge of detecting tumors through magnetic resonance imaging (MRI). Broadly, MRI images are used by specialists to produce pictures of the soft tissue of the body. For tumor detection, image segmentation is needed, but manual segmentation of medical images by the radiologist is a monotonous and prolonged process. MRI is a highly developed medical imaging method providing rich information about human soft-tissue structure. There are various tumor detection and segmentation methods to find and segment a tumor in MRI images, and a range of algorithms has been developed for MRI segmentation using different tools and methods. This paper presents a comprehensive review of the methods and techniques used to detect tumors in MRI images.
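As one small, hedged example of the classical techniques such reviews cover (not a method proposed by this paper), global thresholding can produce a crude tumor candidate mask:

```python
import cv2
import numpy as np

mri = cv2.imread("brain_mri.png", cv2.IMREAD_GRAYSCALE)  # illustrative path

# Otsu's method picks a global threshold separating bright, tumor-like
# regions from darker tissue; real pipelines add skull stripping, denoising
# and far more robust segmentation stages.
blurred = cv2.GaussianBlur(mri, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Keep the largest connected component as a crude candidate region.
num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])     # skip background 0
candidate = np.where(labels == largest, 255, 0).astype(np.uint8)
cv2.imwrite("tumor_candidate.png", candidate)
```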
Full Paper

IJCST/71/1/A-0631
10

Ensured information Recover for Scattered Interference Military Tolerant Networks

B. Sindhu, Smitha Rani Sahu

Abstract

There are measures in military circumstances, for instance, a battle area or an antagonistic region. They are inclined to encounter the evil impacts of spasmodic framework system. They are having progressive portions. Intrusion tolerant framework DTN progressions are is a certifiable and basic courses of action. DTN is a Disruption-tolerant framework. It grants contraptions which are remote and passed on by social orders in a military to participate with each other. These devices get to the mystery information or summon reliably by abusing external limit hubs. In these frameworks organization circumstances DTN is outstandingly productive advancement. Right when there is no wired relationship between a source and a destination device, the information from the source hub might need to sit tight amidst the street hubs for a great deal of time until the affiliation would be adequately settled. One of the testing technique is an ABE. That is trademark based encryption which fulfills the necessities for secure data recuperation in DTNs. The thought is Cipher substance Policy ABE (CP-ABE). It gives a suitable strategy for encryption of data. The encryption consolidates the quality set that the translating needs in order to unscramble the figure content. From this time forward, various customers can be allowed to interpret assorted parts of data as demonstrated by the security approach. We propose a powerful structure for neutralizing range spills in Sensor Networks moreover it promises the assurance sparing arrangement against action examination and stream taking after.
Full Paper

IJCST/71/1/A-0632
11

Analyzing on Effect of U.S. Sub-Prime Crises on Five Major Stock Markets of Different Countries Using Hybrid Wavelet and Neural Network Model

M. Yasin Pir, Firdous Ahmad Shah, Mohammed Asger

Abstract

The correlation of stock returns across different markets has been widely applied to evaluate spillover effects across stock markets. The impact of the U.S. subprime crisis of 2007 has been an important issue in the academic literature during and shortly after the crisis period because of its very severe effects on financial markets and the real economy all over the world. In this paper, we investigate the degree of correlation or co-movement of five major stock markets of five countries: India (Nifty), China (SHCOMP), Germany (DAX), United Kingdom (FTSE 100) and Japan (NKY), each in relation to the U.S. stock market (NASDAQ), using a hybrid wavelet and neural network model. We use a simple Multi-Layer Perceptron Neural Network (MLPNN) based on wavelet decomposition to analyse the relationship between these stock markets. The study indicates that the hybrid model can provide a valuable alternative to existing conventional methodologies for testing financial contagion and can better untangle the relationships between financial institutions.
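As a hedged sketch of the scale-by-scale co-movement analysis the abstract describes (not the authors' model; the return series below are synthetic placeholders), two markets can be compared band by band after a discrete wavelet decomposition:

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
nasdaq = rng.normal(0, 0.01, 256)                 # placeholder NASDAQ returns
ftse = 0.6 * nasdaq + rng.normal(0, 0.005, 256)   # correlated placeholder

# Multilevel DWT with a Daubechies wavelet; band 0 is the coarse
# approximation, later bands are progressively finer detail coefficients.
c_us = pywt.wavedec(nasdaq, "db4", level=3)
c_uk = pywt.wavedec(ftse, "db4", level=3)

for band, (a, b) in enumerate(zip(c_us, c_uk)):
    r = np.corrcoef(a, b)[0, 1]
    print(f"band {band}: correlation {r:+.3f}")
```

In the hybrid model, these per-band coefficients would feed a multi-layer perceptron (for instance sklearn's MLPRegressor) rather than being correlated directly.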
Full Paper

IJCST/71/1/A-0633
12

The Competent Reverse Nearest Neighbors for Outlier Detection in High Dimensional Data

Dora Pavani, K Rajendraprasad

Abstract

Outlier detection in high-dimensional data has become an emerging technique in today's data mining research. It tries to find objects that are considerably dissimilar, exceptional and inconsistent with respect to the majority of records in a database, and it faces various challenges because of the increase of dimensionality. Hubness has recently been developed as an important concept and acts as a characteristic of increasing dimensionality relating to nearest neighbors. Clustering also plays an important role in handling high-dimensional data and is a critical tool for outlier detection. This paper develops a method in which the concept of hubness, specifically the antihub (points with low hubness) algorithm, is embedded in the clusters obtained from clustering techniques such as K-means and Fuzzy C-Means (FCM) to identify outliers, mainly to reduce computation time. It compares the results of all the techniques by applying them to three different real data sets. The experimental results show that when all three algorithms are compared, KCAntihub gives a noteworthy reduction in computational time over Antihub and FCAntihub; it is concluded that when Antihub is combined with K-means, it performs well. "Hubness" has recently been recognized as a general problem of high-dimensional data spaces, manifesting itself in the emergence of objects, so-called hubs, which tend to be among the k nearest neighbors of a large number of data items. As a consequence, many nearest-neighbor relations in such spaces are asymmetric: object y is among the nearest neighbors of x but not vice versa. The work presented here discusses two classes of methods that try to symmetrize nearest-neighbor relations and investigates to what extent they can mitigate the negative effects of hubs. We evaluate local distance scaling and propose a global variant which has the advantage of being easy to approximate for large data sets and of having a probabilistic interpretation. Both local and global approaches are shown to be effective especially for high-dimensional data sets, which are affected by high hubness. Both methods lead to a strong reduction of hubness in these data sets, while at the same time improving properties such as classification accuracy. We evaluate the methods on a large number of public machine learning data sets and synthetic data. Finally, we present a real-world application.
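The antihub idea reduces, at its core, to counting k-occurrences. A hedged sketch with assumed details (synthetic data, plain k-NN) rather than the paper's code:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))          # synthetic high-dimensional data

# N_k(x): how often x appears among other points' k nearest neighbors.
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
_, idx = nn.kneighbors(X)               # column 0 is each point itself

counts = np.zeros(len(X), dtype=int)
for neighbors in idx[:, 1:]:            # skip the self-neighbor
    counts[neighbors] += 1              # k-occurrence tally

# Antihubs: the points retrieved least often, flagged as outlier candidates.
antihubs = np.argsort(counts)[:10]
print("outlier candidates:", antihubs)
```

KCAntihub, per the abstract, would first partition X with k-means and apply this scoring within each cluster to cut computation time.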
Full Paper

IJCST/71/1/A-0634
13

Data-Centric Security in Cloud Computing

Meena Kumari, Neelam Yadav

Abstract

With the inception of cloud computing in 2006, academia has paid much attention to it. Apart from the marvelous advantages it provides, it also comes with various concerns such as loss of control, lack of trust, governance and compliance, and security attacks. One of the most critical concerns regarding cloud computing is security, and more specifically data security in the cloud computing paradigm. Recently, much attention has been paid to securing the networks and hosts (i.e., the infrastructure) holding data, but these existing solutions are not sufficient. This paper emphasizes the need to adopt a new approach to data security, namely data- or information-centric security. It discusses the various enabling tools behind data-centric security and also presents a security framework. This scheme ensures the security of data whether it is at rest or in motion: security procedures or modules are embedded in the data itself rather than in the container of the data.
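As a loose, assumption-laden illustration of that principle (not the paper's framework), an access policy can be cryptographically bound to the payload as authenticated associated data, so the protection travels with the data rather than its container:

```python
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

# The policy rides along as AAD: it is not secret, but any tampering with it
# makes decryption fail, binding policy and payload together.
policy = json.dumps({"owner": "alice", "allow": ["read"]}).encode()
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, b"payroll records", policy)

envelope = {"policy": policy.decode(), "nonce": nonce.hex(),
            "ciphertext": ciphertext.hex()}

# Decryption needs the key AND the untampered policy.
plaintext = aesgcm.decrypt(bytes.fromhex(envelope["nonce"]),
                           bytes.fromhex(envelope["ciphertext"]),
                           envelope["policy"].encode())
print(plaintext)
```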
Full Paper

IJCST/71/1/A-0635
14

Indemnity Appraisal of Pattern Classifiers Under Incursion

Borra Rekhasree, Ch Venkata Rama Padmaja

Abstract

A key feature of our work is that security evaluation is performed empirically, and it is therefore data-dependent; in contrast, model-driven analyses require a complete analytical representation of the problem and of the adversary's behavior that may be very difficult to obtain for real applications. Our most significant contribution is a framework that is applicable to different classifiers, learning algorithms and classification tasks. Pattern classification systems may exhibit vulnerabilities whose exploitation may potentially affect their performance and consequently limit their practical usefulness. We describe a framework for the empirical evaluation of classifier security that formalizes fundamental ideas proposed in the literature. To give practical guidelines for simulating realistic attack scenarios, we describe a general model of the adversary, in terms of goal, knowledge and capability, which encompasses and generalizes models proposed in earlier work. Our model rests on the assumption that the adversary acts rationally to achieve a given goal, consistent with his knowledge of the classifier and his capability of manipulating data, which allows one to derive a corresponding optimal attack strategy. Systems that use pattern classification are employed in adversarial applications, for instance spam filtering, network intrusion detection systems and biometric authentication; exploitation of this adversarial setting may sometimes affect their performance and limit their practical utility. Extending pattern classification theory and design methods to adversarial environments is a novel and relevant research direction, which has not yet been pursued systematically. We address one main open issue: evaluating at design phase the security of pattern classifiers, i.e., the performance degradation under the potential attacks they may incur during operation. We propose a framework for the evaluation of classifier security, and this framework can be applied to various classifiers in applications such as spam filtering, biometric authentication and network intrusion detection.
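A hedged toy version of such an evaluation (synthetic data; the attack model is an assumption, not the paper's): measure how accuracy degrades as an adversary of increasing strength perturbs malicious samples toward the benign class:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
w_true = rng.normal(size=20)
y = (X @ w_true > 0).astype(int)               # 1 = malicious, 0 = benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

# Rational adversary: nudge malicious points along the direction that most
# lowers the classifier's score, with capability bounded by eps.
direction = -clf.coef_[0] / np.linalg.norm(clf.coef_[0])
for eps in [0.0, 0.5, 1.0, 2.0]:
    X_adv = X_te.copy()
    X_adv[y_te == 1] += eps * direction
    print(f"attack strength {eps}: accuracy {clf.score(X_adv, y_te):.3f}")
```

The resulting accuracy-versus-strength curve is the design-phase security characteristic such a framework aims to expose.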
Full Paper

IJCST/71/1/A-0636
15

The Supporting Deduplication Reputation-based Trust Management in Cloud Storage

K.Revan Kumar, A.Swathi

Abstract

Cloud computing involves deploying groups of remote servers and software networks that permit centralized data storage and online access to computer services or resources. Clouds can be classified as public, private or hybrid. Cloud service providers offer both highly available storage and massively parallel computing resources at relatively low cost. As cloud computing becomes pervasive, an increasing amount of data is being stored in the cloud and shared by users with specified privileges, which define the access rights of the stored data. One critical challenge for cloud storage services is the management of the ever-increasing volume of data. Data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data; it is used to improve storage utilization and can also be applied to network data transfers to reduce the number of bytes that must be sent and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique is used to encrypt the data before outsourcing: it encrypts/decrypts a data copy with a convergent key, which is obtained by computing the cryptographic hash value of the content of the data copy. Convergent encryption permits the cloud to perform deduplication on the ciphertexts, and the proof of ownership prevents unauthorized users from accessing the file. To further strengthen security, OAuth is used. OAuth (Open Authorization) is an open protocol for token-based authentication and authorization on the Internet, used in a hybrid cloud to enhance security. OAuth enables the system to verify whether the user is an authenticated person; only such an authenticated user receives the token for uploading and downloading in the public cloud.
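The convergent-encryption mechanism the abstract describes is compact enough to sketch. The assumed details (AES-GCM as the cipher, a hash-derived nonce) are illustrative choices, not the paper's implementation; determinism is deliberate, since identical plaintexts must yield identical ciphertexts for deduplication:

```python
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    key = hashlib.sha256(data).digest()        # convergent key = H(content)
    nonce = hashlib.sha256(key).digest()[:12]  # deterministic per content
    return key, AESGCM(key).encrypt(nonce, data, None)

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce = hashlib.sha256(key).digest()[:12]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key1, ct1 = convergent_encrypt(b"quarterly report")
key2, ct2 = convergent_encrypt(b"quarterly report")
assert ct1 == ct2          # duplicates collapse to one stored ciphertext
print(convergent_decrypt(key1, ct1))
```

Nonce reuse is acceptable here only because each key is unique to its content, so each (key, nonce) pair encrypts exactly one message; the cloud never sees the keys, only the matching ciphertexts.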
Full Paper

IJCST/71/1/A-0637
16

A Customer Side Security Protection Structure Called Ups for Customized Web Search

P.V.Manoj, Y.Manas Kumar, M.Raja Kumar

Abstract

To protect user privacy in profile-based personalized web search (PWS), researchers have to consider two conflicting effects during the search process. On the one hand, they attempt to improve search quality with the personalization utility of the user profile; on the other hand, they need to hide the private contents present in the user profile to keep the privacy risk under control. The web search engine has long become the most important gateway for ordinary people looking for useful information on the web, but users may experience failure when search engines return irrelevant results that do not meet their real intent. Such irrelevance is largely due to the enormous variety of users' contexts and backgrounds, as well as the ambiguity of texts.
Full Paper

IJCST/71/1/A-0638
17

The Protection Significance of Scheme Classifiers in Offensive Confront

S.Santosh Janardan Kumar, A. Swathi

Abstract

Pattern classification systems are commonly used in adversarial applications, such as biometric authentication, network intrusion detection and spam filtering, in which data can be deliberately manipulated by humans to undermine their operation. As this adversarial setting is not taken into account by classical design methods, pattern classification systems may exhibit vulnerabilities whose exploitation may severely affect their performance and consequently limit their practical utility. Several works have addressed the problem of designing robust classifiers against these threats, although mainly focusing on specific applications and types of attacks. In this paper, we address one of the main open issues: evaluating at design phase the security of pattern classifiers, namely the performance degradation under potential attacks they may incur during operation. We propose a framework for the empirical evaluation of classifier security that formalizes and generalizes the main ideas proposed in the literature. Network security consists of the provisions and policies adopted by a system administrator to prevent and monitor unauthorized access. Email is the standard communication medium now; everyone uses mail, all business correspondence is carried by mail communication, and this mail communication includes spam mails. Spam emails often include URLs to sites or webpages that lead to infection or hacking. Systems already exist for detecting spam mails, but they do not recognize all spam. Spamming is the indiscriminate use of electronic messages to send or receive unsolicited bulk messages, especially advertising. In this framework we aim to detect all spam by scanning email before it is read by the users: blocking the domain independently of the user's email ID, keyword-based blocking by checking the subjects, distinguishing between public and private domains before blocking, and password security by biometrics. Facial recognition with fractal identification (face filtering) and verification is a distinctive technique to identify each person; we use a brute-force string matching algorithm. It shows that candidate images in a face-scanning recognition system can be identified efficiently using the spread dependency of pixels arising from transform codes of images.
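As a hedged, toy illustration of the keyword/content-based filtering stage (invented data, not the paper's system), a TF-IDF plus naive Bayes pipeline can flag spam subjects:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

mails = ["win a free prize now", "meeting agenda attached",
         "cheap loans click here", "lunch tomorrow?"]
labels = [1, 0, 1, 0]                      # 1 = spam, 0 = ham (toy labels)

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(mails, labels)
print(model.predict(["free prize click here", "agenda for tomorrow"]))
```

The domain blacklisting and biometric login the abstract mentions would sit in front of and behind this content check, respectively.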
Full Paper

IJCST/71/1/A-0639
18

Graphical Based Registration and Authentication System

Dr. Sarita Kadian

Abstract

User authentication is the most important factor in network security; therefore, passwords are used for high-level authentication of computers and information security. The graphical password method is used to remove the complications that arise in securing information. Graphical passwords are used more these days because our mind can remember pictures more easily than text. We introduce a graphical password-based authentication scheme in this paper, which helps to thwart attacks such as dictionary attacks and brute-force attacks. Images are easier to recall than text; a graphical password is hard to guess and not easy to crack, and it is easy for users to work with.
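A hedged sketch of one common variant, a click-point graphical password (the paper does not prescribe this exact scheme): the user's ordered clicks on an image are snapped to grid cells, then salted and hashed like a text password:

```python
import hashlib

GRID = 50  # pixels per grid cell; tolerance for imprecise clicks

def digest(clicks, salt: bytes) -> str:
    cells = [(x // GRID, y // GRID) for x, y in clicks]    # snap to grid
    encoded = ",".join(f"{cx}:{cy}" for cx, cy in cells).encode()
    return hashlib.pbkdf2_hmac("sha256", encoded, salt, 100_000).hex()

salt = b"per-user-random-salt"             # illustrative; store one per user
registered = digest([(120, 340), (410, 95), (260, 220)], salt)

# Login: nearby clicks land in the same cells, so the hashes match.
attempt = digest([(118, 345), (405, 90), (255, 230)], salt)
print("authenticated" if attempt == registered else "rejected")
```

The grid quantization is what makes the scheme usable; real systems tune the cell size against the resulting brute-force search space.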
Full Paper

IJCST/71/1/A-0640
19

The Advanced Prominent to Percolate Content Sharing Significances from OSN User Walls

Guntreddy Sandeep, Sujatha Thulam

Abstract

In recent years, online social networks (OSNs) have become a popular interactive medium to communicate, share and disseminate a considerable amount of human life information. Daily and continuous communication implies the exchange of several types of content, including free text, image, audio and video data. The huge and dynamic character of these data creates the premise for the employment of web content mining strategies aimed at automatically discovering useful information latent within the data and then providing active support in the complicated tasks involved in social network analysis and management. A main part of social network content is constituted by short text; a notable example are the messages permanently written by OSN users on particular public/private areas, usually called walls. The aim of the present work is to propose and experimentally evaluate an automated system, called Filtered Wall (FW), able to filter unwanted messages from social network user walls. The key idea of the proposed system is the support for content-based user preferences. This is possible thanks to the use of a machine learning (ML) text categorization technique [4] able to automatically assign each message a set of categories based on its content. We believe the proposed strategy is a key service for social networks, in that in today's social networks users have little control over the messages displayed on their walls. For example, Facebook allows users to state who is allowed to insert messages on their walls (i.e., friends, friends of friends, or defined groups of friends), but no content-based preferences are supported; for instance, it is not possible to prevent political or vulgar messages. In contrast, by means of the proposed mechanism, a user can specify what content should not be displayed on his/her wall by specifying a set of filtering rules. Filtering rules are very flexible in terms of the filtering requirements they can support, in that they allow one to specify filtering conditions based on user profiles and user relationships, as well as the output of the ML categorization process. Moreover, the system provides support for user-defined blacklists, that is, lists of users that are temporarily prevented from posting messages on a user's wall. To the best of our knowledge, this is the first proposal of a system to automatically filter unwanted messages from OSN user walls on the basis of both message content and the message creator's relationships and characteristics. Major differences include a different semantics for filtering rules to better fit the considered domain, an online setup assistant to help users in FR specification, the extension of the set of features considered in the categorization process, a more in-depth performance evaluation study and an update of the prototype implementation to reflect the changes made to the categorization techniques.
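A hedged toy of the two-part design (invented data; LinearSVC stands in for the paper's categorizer): an ML text categorizer labels each message, and a filtering rule combines the predicted category with the creator's relationship:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_msgs = ["vote for our party", "great match last night",
              "politicians are lying again", "see you at the gym"]
train_cats = ["politics", "neutral", "politics", "neutral"]

categorizer = make_pipeline(TfidfVectorizer(), LinearSVC())
categorizer.fit(train_msgs, train_cats)

# One illustrative filtering rule: hide political posts from non-friends.
def display(message: str, creator_relation: str) -> bool:
    category = categorizer.predict([message])[0]
    return not (category == "politics" and creator_relation != "friend")

print(display("politicians lying about the vote", "friend_of_friend"))
print(display("see you at the gym tonight", "friend_of_friend"))
```

Blacklists would simply short-circuit display() before the categorizer runs.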
Full Paper

IJCST/71/1/A-0641
20

Comparison View of Different Technologies Used to Prioritize the Test Case in Regression Testing

Dr. Amit Verma, Rohit Bajaj, Ishadeep Kaur Luthra

Abstract

Testing is a part of software development and is carried out using different methods. Our research is about regression testing and the various techniques that have been developed so far. In regression testing, various test cases are developed and testing is performed repeatedly on them; the main objective of regression testing is to find errors before the implementation of the system. Various techniques have been developed to increase the performance of regression testing, such as the greedy algorithm, genetic algorithms, particle swarm optimization, clustering approaches and many more. In these techniques the main concept is prioritization of test cases, so that regression testing is performed only on validated test cases, thus improving performance and saving time. This survey can be used for analyzing the various tools and techniques that can be implemented alongside existing algorithms, and a hybrid model may be proposed.
Full Paper

IJCST/71/1/A-0642
21

Optimized Image Classification Based on Universal Image Distance and Support Vector Machines

Nandita Chasta, Manish Tiwari

Abstract

Image classification of remotely sensed images is one of the most important fields of research in computer engineering. Image classification techniques are used in object recognition, quality control and OCR systems. Many of the machine vision systems used in industrial applications employ well-known image processing algorithms to discriminate between good and bad parts. Algorithms such as thresholding, blob analysis and edge detection, for example, can be found in every machine vision software vendor's toolbox, since they can be used in numerous applications to solve a relatively large number of imaging tasks. Image classification may be performed using supervised, unsupervised or semi-supervised learning techniques. In supervised learning, the system is presented with numerous examples of images that must be manually labeled; using this training data, a learned model is generated and used to predict the features of unknown images. Such traditional supervised learning techniques can use either generative or discriminative models to perform this task. In this dissertation, UID techniques are used in an optimized manner to represent an image as a vector in finite dimensions. The distance between this representation and that of a prototype image is computed to find the similarity score between the images; this matching score can be used to train any machine learning system in a supervised or unsupervised environment. In this dissertation, an SVM-based classifier is trained on these feature vectors in a supervised environment. The precision and accuracy of the machine are computed against benchmark image classification techniques. The overall performance of the proposed methods is evaluated using the R simulator in terms of precision, recall and the kappa measure. Simulation results establish the validity and efficiency of the approach.
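The abstract's evaluation is run in R; as a hedged Python stand-in (the digits dataset substitutes for the UID feature vectors), the supervised SVM stage with accuracy and kappa reporting looks like:

```python
from sklearn import datasets, svm
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

digits = datasets.load_digits()
X = digits.images.reshape(len(digits.images), -1)  # 8x8 images -> 64-d vectors
y = digits.target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = svm.SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("kappa:   ", cohen_kappa_score(y_te, pred))
```

In the dissertation's setting, each row of X would instead be the UID distance vector between an image and a set of prototype images.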
Full Paper

IJCST/71/1/A-0643
22

A Novel Density based K-means Clustering for Test Case Prioritization in Regression Testing Result-II

Dr. Amit Verma, Rohit Bajaj, Ishadeep Kaur Luthra

Abstract

In this paper, we work on improving test case prioritization on the basis of a clustering approach. A novel density-based k-means clustering approach is used to group different test cases into clusters on the basis of statement coverage. Then, Prim's algorithm is used to find the minimum path between different test cases according to their coverage information. Test cases are selected from every cluster that have maximum coverage information. Using Prim's algorithm, we find a tree of test cases; this technique reduces the number of test cases, since only those with maximum coverage information are selected. It thereby also reduces effort, cost and time.
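A hedged sketch of the pipeline under stated substitutions: plain k-means stands in for the paper's density-based variant, and scipy's minimum spanning tree stands in for Prim's algorithm:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans

coverage = np.array([                 # rows = test cases, cols = statements
    [1, 1, 0, 0, 0, 1],
    [1, 1, 1, 0, 0, 1],
    [0, 0, 1, 1, 1, 0],
    [0, 0, 1, 1, 0, 0],
    [1, 0, 0, 1, 1, 1],
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coverage)

selected = []                         # max-coverage test from each cluster
for c in set(labels):
    members = np.where(labels == c)[0]
    selected.append(members[np.argmax(coverage[members].sum(axis=1))])

dist = squareform(pdist(coverage[selected], metric="hamming"))
mst = minimum_spanning_tree(dist)     # Prim-style tree over selected tests
print("selected tests:", selected)
print("MST edges:\n", mst.toarray())
```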
Full Paper

IJCST/71/1/A-0644
23

Improved Caesar Cipher Algorithm Using Multistage Encryption

Greetta Pinheiro, Shruti Saraf

Abstract

Cryptographic algorithms play an important role in the security domain. In this system, in order to increase the security of the Caesar cipher, some basic mathematical calculations are performed on the cipher text to make it strong. The proposed system is case sensitive. The encryption and decryption of the plain text are done by using the face values and positional values of the corresponding characters as the key. Multistage encryption is imposed on the plain text, which improves its security and protects it, to an extent, from brute-force attacks, pattern matching and frequency analysis. We further discuss the need for additional methodology beyond the existing scenario.
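The paper's exact key schedule is not reproduced here; a hedged sketch of the multistage idea, where a position-dependent second stage stops identical letters from mapping to the same symbol (blunting frequency analysis):

```python
def encrypt(plaintext: str, shift: int) -> str:
    out = []
    for i, ch in enumerate(plaintext):
        stage1 = (ord(ch) + shift) % 256   # classic Caesar stage (face value)
        stage2 = (stage1 + i) % 256        # positional-value stage
        out.append(chr(stage2))
    return "".join(out)

def decrypt(ciphertext: str, shift: int) -> str:
    out = []
    for i, ch in enumerate(ciphertext):
        out.append(chr((ord(ch) - i - shift) % 256))
    return "".join(out)

msg = "Attack at Dawn"                     # case-sensitive, as proposed
ct = encrypt(msg, 7)
assert decrypt(ct, 7) == msg
print(repr(ct))
```

Note that the repeated 't' in "Attack" encrypts to two different symbols, which is the point of the positional stage.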
Full Paper

IJCST/71/1/A-0645
24

A Survey on the Atmospheric Effects on Radio Path Loss in Cellular Mobile Communication System

C S Hanchinal, Dr. K.N.Muralidhara

Abstract

Radio propagation modeling is essential for emerging technologies, with appropriate design, deployment and management strategies for any wireless network. Propagation is heavily site-specific and can vary significantly depending on terrain, atmospheric effects, frequency of operation, velocity of the mobile terminal, interference sources and other dynamic factors. Accurate characterization of the radio channel through key parameters and a mathematical model is important for predicting signal coverage, achievable data rates, network planning, quality of service, handover performance, etc. The efficiency of present path loss models for cellular communication systems suffers when they are used in environments other than those for which they were developed. Accurate path loss can be determined by measuring signal strength through site-specific field measurements.
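For reference, the standard textbook log-distance model (not taken from this paper) that such site-specific measurements are typically fitted against is:

```latex
\mathrm{PL}(d) = \mathrm{PL}(d_0) + 10\,n\,\log_{10}\!\left(\frac{d}{d_0}\right) + X_\sigma
```

where PL(d_0) is the path loss at a reference distance d_0, n is the path loss exponent (about 2 in free space, roughly 2.7 to 3.5 in urban areas) and X_sigma is zero-mean Gaussian shadowing in dB; atmospheric and terrain effects can show up as changes in the fitted n and sigma.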
Full Paper

IJCST/71/1/A-0646
25

Critical Success Factors for Implementing Customer Relation Management (CRM) System within University Context Concepts and Literature Review the Gulf Region Perspective

Dr. Ashraf Badawood

Abstract

Higher education institutions all over the world are currently going through important changes in how they interact with students, donors, alumni and researchers. Most of these institutions are mainly focused on ways to cut costs and become efficient in the way they respond to customers' needs, and hence gain competitive advantage. This study therefore examines the critical success factors (CSFs) which higher education institutions, particularly those in the Gulf region of the Middle East, should consider in order to successfully implement a CRM system and achieve their objectives. A literature review was used to collect data for this study, with peer-reviewed journals as the main sources. While several studies indicate different CSFs, this study focuses on four factors: people, technology, process and culture. Further, recommendations on how universities in the Gulf region should implement CRM successfully are provided.
Full Paper

IJCST/71/1/A-0647
26

EARL: Effective Augmented Reality Learning

Nisarg M. Vasavada, Dhwani P. Sametriya, Dipika S. Vasava

Abstract

Learning is a process of adapting facts as effectively as possible. While the subjects can span from sociology to sensor networks, learning only becomes effective when the information is both delivered and interpreted properly. With existing technology it is possible to adopt a breakthrough teaching-learning system targeted at filling the loopholes in how a student receives, interprets and applies information. In this paper EARL (Effective Augmented Reality based Learning), a long-term, cost-effective Augmented Reality (AR) solution, is proposed along with a brief introduction to AR. The proposed solution not only focuses on system implementation but also emphasizes modifying the content of the information to be delivered. The proposed system architecture and suggested implementation strategies are discussed in detail along with their challenges and future scope.
Full Paper

IJCST/71/1/A-0648
27

Ontology Based Web Crawler for Specific Domain

Swasti Singhal, Neeraj Kumar

Abstract

With the large amount of data available on the World Wide Web (WWW), browsing for a topic related to a specific domain takes too much time and also surfaces irrelevant web pages, which is undesirable. It is not an easy task for a crawler to download only data-mining-related web pages. Ontology is a technique for accessing only data-mining-related, i.e., domain-specific, pages. The basic goal of an ontology-based domain-specific web crawler is therefore to select and seek out the web pages that fulfill the user's requirement, for example web pages related to data mining. Link analysis algorithms such as PageRank and other metrics are used to prioritize the URLs, based on their ranking and selection policies, for downloading the most relevant web pages.
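A hedged toy of the crawler loop (a hand-rolled term set stands in for a real ontology, fetching is stubbed with an in-memory map, and relevance scoring substitutes for full PageRank):

```python
import heapq

ONTOLOGY_TERMS = {"data", "mining", "clustering", "classification"}

def relevance(page_text: str) -> float:
    words = page_text.lower().split()
    return sum(w in ONTOLOGY_TERMS for w in words) / max(len(words), 1)

# Stub corpus standing in for fetched pages: url -> (text, outlinks).
WEB = {
    "http://a.example": ("data mining clustering tutorial", ["http://b.example"]),
    "http://b.example": ("cooking recipes and travel", ["http://c.example"]),
    "http://c.example": ("classification in data mining", []),
}

frontier = [(-1.0, "http://a.example")]   # max-heap via negated scores
seen = set()
while frontier:
    _, url = heapq.heappop(frontier)
    if url in seen:
        continue
    seen.add(url)
    text, links = WEB[url]
    score = relevance(text)
    if score > 0:                          # keep only on-topic pages
        print(f"kept {url} (relevance {score:.2f})")
    for link in links:                     # prioritize frontier by relevance
        heapq.heappush(frontier, (-relevance(WEB[link][0]), link))
```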
Full Paper

IJCST/71/1/A-0649