International Journal of Computer Science and Technology, Vol. 5.2-2, Apr to June, 2014

A New Key Management Transmission Paradigm to Remote Cooperative Groups

T Sundararajulu, Erasappa Murali


The problem of efficiently and securely broadcasting to a remote cooperative group occurs in many newly emerging networks. A major challenge in devising such systems is to overcome the obstacles of the potentially limited communication from the group to the sender, the unavailability of a fully trusted key generation center, and the dynamics of the sender. The existing key management paradigms cannot deal with these challenges effectively. In this paper, we circumvent these obstacles and close this gap by proposing a novel key management paradigm. The new paradigm is a hybrid of traditional broadcast encryption and group key agreement. In such a system, each member maintains a single public/secret key pair. Upon seeing the public keys of the members, a remote sender can securely broadcast to any intended subgroup chosen in an ad hoc way. Following this model, we instantiate a scheme that is proven secure in the standard model. Even if all the non-intended members collude, they cannot extract any useful information from the transmitted messages. After the public group encryption key is extracted, both the computation overhead and the communication cost are independent of the group size. Further, our scheme facilitates simple yet efficient member deletion/addition and flexible rekeying strategies. Its strong security against collusion, its constant overhead, and its implementation friendliness without relying on a fully trusted authority render our protocol a very promising solution to many applications.


Performance Evaluation of Edge Detection Operators for Character Extraction

Rana Gill, Chandandeep Kaur


Segmentation of images can be implemented using different fundamental algorithms such as Edge Detection (discontinuity-based segmentation), Region Growing (similarity-based segmentation), and the Iterative Thresholding Method. A comprehensive literature review relevant to the study describes different techniques for vehicle number plate detection and edge detection techniques widely used on different types of images. This research work is based on edge detection techniques and on calculating a threshold on the basis of five edge operators: Prewitt, Roberts, Sobel, LoG, and Canny. Segmentation of characters present in different types of images, such as vehicle number plates, house name plates, and characters on different sign boards, is selected as a case study in this work. The proposed methodology has seven stages, and the proposed system has been implemented using MATLAB R2010a. A comparison of all five operators has been done on the basis of their performance. From the results it is found that the Canny operator produces the best results among the operators used; the performance of the edge operators in decreasing order is: Canny > LoG > Sobel > Prewitt > Roberts.
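As a minimal illustration of two of the five operators compared above, the following pure-Python sketch (not the authors' MATLAB implementation) convolves an image with the Sobel and Prewitt kernels and thresholds the gradient magnitude; the synthetic image and the threshold value are invented for the example.

```python
# Gradient-magnitude edge detection with the Sobel and Prewitt 3x3 kernels.
# The image is a plain list-of-lists of 8-bit grayscale values.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
PREWITT_Y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]

def gradient_at(img, kx, ky, r, c):
    """Gradient magnitude at interior pixel (r, c) using kernels kx, ky."""
    gx = gy = 0
    for i in range(3):
        for j in range(3):
            p = img[r + i - 1][c + j - 1]
            gx += kx[i][j] * p
            gy += ky[i][j] * p
    return (gx * gx + gy * gy) ** 0.5

def edge_map(img, kx, ky, threshold):
    """Binary edge map over the interior: 1 where the gradient exceeds threshold."""
    rows, cols = len(img), len(img[0])
    return [[1 if gradient_at(img, kx, ky, r, c) > threshold else 0
             for c in range(1, cols - 1)] for r in range(1, rows - 1)]

# Synthetic plate-like image: dark background, bright vertical stroke.
img = [[0, 0, 0, 255, 255, 255] for _ in range(6)]
sobel_edges = edge_map(img, SOBEL_X, SOBEL_Y, 100)
prewitt_edges = edge_map(img, PREWITT_X, PREWITT_Y, 100)
```

Both operators mark the two pixel columns flanking the intensity jump; Canny adds smoothing, non-maximum suppression, and hysteresis thresholding on top of this same gradient step, which is why it leads the ranking reported above.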


Tate Pairing in Cryptography

K.M.Jadhav, S.R.Idate


In this paper the concept of the Tate pairing is described in detail. The Tate pairing over elliptic curves is important in cryptography, and it is one of the algorithms used in pairing-based cryptography. Pairing-based cryptography maps pairs of elements from two cryptographic groups to a third group in order to construct cryptographic systems. If the same group is used for the first two groups, the pairing is called symmetric and is a mapping from two elements of one group to an element of a second group. In this way, pairings can be used to reduce a hard problem in one group to a different, usually easier, problem in another group.
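The property that makes pairings such as the Tate pairing useful, and that underlies the reduction mentioned above, is bilinearity; in standard notation:

```latex
e : G_1 \times G_2 \to G_T, \qquad
e(aP,\, bQ) = e(P, Q)^{ab}
\quad \text{for all } P \in G_1,\; Q \in G_2,\; a, b \in \mathbb{Z}.
```

In particular, $e(aP, Q) = e(P, Q)^{a}$, so a discrete logarithm instance in $G_1$ can be transported into $G_T$, where the problem may be easier to solve; this is the precise sense in which a hard problem in one group is reduced to a problem in another group.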


A Review: Architectures and Localization Techniques for Underwater Acoustic Sensor Networks

Divya Bhargava, Nishant Anand


Wireless information transmission through the ocean is one of the emerging technologies for the expansion of future aqueous observation systems and sensing networks. Applications of underwater sensing range from oil production to aquaculture, and comprise instrument monitoring, pollution control, weather recording, prediction of natural disturbances, search and inspection missions, and the study of marine life. Underwater sensor networks have drawn a lot of interest because of the many underwater applications. We put forward a hierarchical approach which divides the whole localization process into two sub-processes: anchor-node localization and ordinary-node localization. Numerous techniques have been used in the past. This paper is about the localization problem in underwater sensor networks. The unfavorable aqueous environments, node mobility, and the large network scale pose various new challenges; most current localization schemes are not applicable, and human operation underwater is very limited.


A Survey of Density Based Clustering Algorithms

Rashi Chauhan, Pooja Batra, Sarika Chaudhary


Clustering means dividing the data into groups (known as clusters) in such a way that objects belonging to a group are similar to each other but dissimilar to objects belonging to other groups. This paper is intended to give a survey of density-based clustering algorithms in data mining. The density-based clustering algorithms reviewed are DBSCAN, OPTICS, and DENCLUE.
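The density idea shared by the surveyed algorithms is most easily seen in DBSCAN: points with at least `min_pts` neighbors within radius `eps` grow clusters, and isolated points become noise. A minimal pure-Python sketch (the point set and parameters are invented for the example; OPTICS and DENCLUE refine this same density notion):

```python
# Minimal DBSCAN over 2-D points. Labels: cluster id 0, 1, ..., or -1 = noise.

def region_query(points, i, eps):
    """Indices of all points within distance eps of points[i] (including i)."""
    px, py = points[i]
    return [j for j, (qx, qy) in enumerate(points)
            if (px - qx) ** 2 + (py - qy) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    labels = [None] * len(points)          # None = unvisited
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1                 # noise (may be claimed as border later)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:                       # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster        # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = region_query(points, j, eps)
            if len(j_neighbors) >= min_pts:
                seeds.extend(j_neighbors)  # only core points keep expanding
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(pts, eps=2.0, min_pts=3)
```

On this input the two dense triples form two clusters and the far-away point is labeled noise, which is exactly the behavior that distinguishes density-based methods from partitioning methods such as k-means.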


A Review: Mouse Gesture Recognition Techniques using Neural Networks

Shivani Jaryal, Dimple Arya


Understanding mouse motions can be posed as a pattern recognition problem. In order to convey pictorial messages to a receiver, a mouse expresses motion patterns. Loosely called gestures, these patterns are variable but distinct and have an associated significance. Pattern recognition by a computer or machine can be executed via various methods such as Hidden Markov Models (HMM), Linear Programming (LP), and Neural Networks (NNs). Each method has its own benefits and drawbacks, which are studied separately. This paper reviews why using ANNs in particular is better suited for analyzing mouse gesture patterns.


Document Clustering with Classification in Correlation Similarity Measure Space

K.Jaya Bala Ratheesh, R.Lotus


This paper presents a new spectral clustering method called Correlation Preserving Indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are minimized simultaneously. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable for detecting the intrinsic geometrical structure of the document space than Euclidean distance. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.
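The motivation for correlation over Euclidean distance can be shown in a few lines: two documents with the same topic profile but very different lengths are maximally correlated even though their term-frequency vectors are far apart in Euclidean terms. A small sketch with invented term-count vectors:

```python
# Pearson correlation vs Euclidean distance on term-frequency vectors.

def correlation(u, v):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [x - mu for x in u]
    dv = [y - mv for y in v]
    num = sum(a * b for a, b in zip(du, dv))
    den = (sum(a * a for a in du) * sum(b * b for b in dv)) ** 0.5
    return num / den

def euclidean(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

short_doc = [2, 1, 0, 0]    # term counts for a short document
long_doc = [20, 10, 0, 0]   # same topic profile, ten times longer
other_doc = [0, 0, 2, 1]    # different topic, similar length to short_doc
```

Here `correlation(short_doc, long_doc)` is exactly 1.0 while their Euclidean distance is large, so a Euclidean method would separate two documents on the same topic that a correlation-based method, such as CPI, keeps together.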


Cost Model for Object Serialization Using XML and JSON Formatters

Pooja Manocha, Rahul Kadian


The paper presents object serialization methods that can be useful for several purposes, including serialization minimization, which can be used to reduce the size of serialized data. We have implemented means by which serialization and de-serialization of objects can be done using the modern formats XML and JSON, after adding compression or encryption, or possibly both, to the object streams.
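The kind of measurement such a cost model rests on can be sketched with the standard library: serialize the same object to JSON and to a simple XML rendering, compress both streams, and compare byte sizes. The record and the minimal XML writer below are hypothetical helpers, not the paper's formatters.

```python
# Compare serialized and compressed sizes of one object in JSON and XML.
import json
import zlib

record = {"id": 42, "name": "sensor-7", "readings": [1.5, 2.5, 3.5] * 20}

json_bytes = json.dumps(record).encode("utf-8")
xml_bytes = (
    "<record><id>42</id><name>sensor-7</name><readings>"
    + "".join(f"<r>{v}</r>" for v in record["readings"])
    + "</readings></record>"
).encode("utf-8")

sizes = {
    "json": len(json_bytes),
    "xml": len(xml_bytes),
    "json+zlib": len(zlib.compress(json_bytes)),
    "xml+zlib": len(zlib.compress(xml_bytes)),
}

# De-serialization must invert the whole pipeline exactly.
restored = json.loads(zlib.decompress(zlib.compress(json_bytes)))
```

Because both streams here are highly repetitive, compression shrinks each of them, and the XML stream pays a visible tag overhead before compression; a cost model can be populated by running exactly this comparison over representative objects.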


Software Analytics for Mobile Applications

R.S. Patil, S. R. Idate


Mobile applications, known as apps, are software systems running on portable devices such as smart phones and tablet PCs. The market for apps has rapidly expanded in the past few years into a multi-billion dollar business. Being a new phenomenon, it is unclear whether approaches to maintain and comprehend traditional software systems can be ported to the context of apps. We present a novel approach to comprehend apps that draws on multiple domains, from software analytics to mining software repositories to software visualization. The goal of software analytics is to enable software practitioners to perform data exploration and analysis in order to obtain insightful and actionable information for data-driven tasks around software and services. We treat app store mining and analysis as a form of software repository mining. Unlike the software repositories traditionally used in mining software repository (MSR) work, app stores provide a wealth of other information in the form of pricing and customer reviews. Therefore, in this system data mining is used to extract feature information, which is then combined with more readily available information to analyze apps' technical, customer, and business aspects. This approach is applied to the 32,108 nonzero-priced apps available in the Blackberry app store. An approach is then introduced based on a combination of software visualization and software metrics: software visualizations enriched with software metrics, called polymetric views. In polymetric views, the shapes in the visualizations depict a set of chosen software metrics.


A Review of Automated Cloudlet Resource Allocation in Virtualized Cloud Systems

Kanika Takkar, Neha


Cloud computing has become a new-age technology with enormous potential in enterprises and global markets. Clouds make it possible to access applications and associated data from anywhere seamlessly. Companies are able to rent resources from the cloud for storage and other computational purposes so that their infrastructure cost can be reduced significantly. However, the challenges are also immense: as users request a number of cloud services simultaneously, resources must be made available to the requesting users in an efficient manner to satisfy their needs without compromising the performance of the resources. The other challenges of resource allocation are meeting customer demands and application requirements.


Review Paper on Different Methods for Doppler Spectrogram Calculation

Harsha S. Jain, Ashwini G. Andurkar


Doppler echocardiography is the most widely used technique for the diagnosis of blood flow abnormalities in the heart and of valve functioning. It uses the principle of the Doppler shift for this diagnosis. The Doppler shift produced by blood flow is proportional to the velocity of the blood flow, so by obtaining the frequency shift value one can easily obtain the different clinical indices of blood flow. For all of this, a spectrogram is required, and any type of noise may degrade its readability. Thus, there is a need for an efficient algorithm to calculate the spectrogram. This paper focuses on different methods of Doppler spectrogram calculation.


Empirical Test Case Minimization in Software Models

Neha Pahwa, Pawan Garg, Shakti Nagpal


Test case minimization provides facilities to execute test cases with priorities. Many empirical studies have shown that test case minimization can improve the complete test suite and that fault detection can also be heavily improved. This survey covers empirical studies and related work in CBSE testing, including the challenges that testers encounter during the development and testing phases of systems.
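A common baseline in the minimization studies the survey covers is the greedy set-cover heuristic: repeatedly pick the test that covers the most still-uncovered requirements. A small sketch with hypothetical test and requirement names (not taken from any surveyed study):

```python
# Greedy test-suite minimization: cover all requirements with few tests.

def minimize(test_coverage):
    """test_coverage: {test_name: set of requirements it covers}."""
    uncovered = set().union(*test_coverage.values())
    selected = []
    while uncovered:
        # Pick the test covering the most still-uncovered requirements.
        best = max(test_coverage, key=lambda t: len(test_coverage[t] & uncovered))
        if not test_coverage[best] & uncovered:
            break
        selected.append(best)
        uncovered -= test_coverage[best]
    return selected

suite = {
    "t1": {"r1", "r2"},
    "t2": {"r2", "r3", "r4"},
    "t3": {"r1"},           # redundant: r1 is covered more cheaply by t1
    "t4": {"r4", "r5"},
}
minimized = minimize(suite)
```

The heuristic drops the redundant test while preserving full requirement coverage; prioritization schemes reuse the same selection order as an execution order.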


A Framework For Video Streaming and Sharing of Video in Clouds

M. Kiran Mayee, M.Mamatha


Video streaming has been an interesting topic for many years, and the technology is now developed enough to offer such services to end-users over the Internet. Video streaming is based on the stream of traffic over networks. The capacity of a wireless link cannot be provided on demand: users request videos through wireless links, and the capacity of these links cannot keep pace with the traffic demand. In this paper we propose to develop a new video framework, labeled AMES-Cloud, which is divided into two main parts: AMoV (adaptive mobile video streaming) and ESoV (efficient social video sharing). AMoV and ESoV build up third-party agents to provide efficient video streaming services for every user, and these private agents try to prefetch video content in advance. We implement a prototype of the AMES-Cloud framework to show its efficiency, demonstrating that the third-party agents in the cloud can effectively provide adaptive streaming and perform video sharing.


A New PDP Estimation Technique for the MIMO-OFDM System

Allanki. Sanyasi Rao, S.Srinivas


In this paper we propose a new technique for pilot-assisted power delay profile (PDP) estimation for the LMMSE channel estimator in the multiple-input multiple-output (MIMO) OFDM system. The distortions due to the null subcarriers and the distortions due to insufficient samples are also considered. The simulation results show that the proposed PDP estimator gives good performance in estimation when compared to Wiener filtering and Kalman filtering.


Prototype Based Supervised and Scalable Re-ranking of Images Through CBIR

T.Srilatha, K. Neeraja


The existing methods for image search reranking suffer from the unreliability of the assumptions under which the initial text-based image search result is used in the reranking process. In this paper, we propose a prototype-based reranking method to address this problem in a supervised and scalable fashion. The typical assumption that the top-N images in the text-based search result are equally relevant is relaxed by linking the relevance of the images to their initial rank positions. We then employ a number of images from the initial search result as the prototypes that serve to visually represent the query and that are subsequently used to construct meta rerankers. By applying different meta rerankers to an image from the initial result, reranking scores are generated, which are then aggregated using a linear model to produce the final relevance score and the new rank position for an image in the reranked search result. Human supervision is introduced to learn the model weights offline, prior to the online reranking process. While model learning requires manual labeling of the results for a few queries, the resulting model is query independent and therefore applicable to any other query. The experimental results on a representative web image search dataset comprising 353 queries demonstrate that the proposed method outperforms the existing supervised and unsupervised reranking approaches. Moreover, it improves the performance over the text-based image search engine by more than 25%.


A Supervised Fuzzy Clustering Approach for Churn Prediction

Vijaya Geeta Dharmavaram


Customer attrition, or churn as it is popularly known in the cellular industry, is one of the major problems faced by the industry. Since it is often more expensive to acquire a new customer than to retain one, timely prediction of customer churn provides huge dividends in building appropriate customer retention programs, thus prompting the need for a churn prediction model. Most of the churn prediction models proposed in the past tend to classify a customer strictly as 'Churner' or 'Non-Churner', which is not appropriate since customer behavior is often fuzzy. To address the fuzziness of customer behavior, a fuzzy approach is proposed. The proposed approach calculates a 'Churn Index Score' for each customer that reflects the degree of likelihood of churn of that customer. Based on the Churn Index Score, customers can be targeted for retention programs. The performance of the model was tested on customer data and the results, based on cumulative gains and lift charts, are encouraging.
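The contrast with a hard classifier can be sketched with a toy membership score. The features, centroids, and scoring function below are entirely invented for illustration and are not the paper's supervised fuzzy clustering model; they only show how a degree in [0, 1] replaces the strict Churner/Non-Churner split.

```python
# Toy fuzzy churn score from distances to two behaviour centroids.

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def churn_index(customer, churner_centroid, loyal_centroid):
    """Membership degree in the churner class: 1.0 = certain churner."""
    d_churn = distance(customer, churner_centroid)
    d_loyal = distance(customer, loyal_centroid)
    return d_loyal / (d_churn + d_loyal)

# Hypothetical features: (monthly usage minutes / 100, complaints last quarter)
churner_centroid = (1.0, 4.0)
loyal_centroid = (8.0, 0.0)

fence_sitter = churn_index((4.5, 2.0), churner_centroid, loyal_centroid)
likely_churner = churn_index((1.5, 3.5), churner_centroid, loyal_centroid)
```

A hard classifier must force the equidistant customer into one class; the fuzzy score instead reports 0.5, so retention spending can be ranked by score rather than by a binary label.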


Enhanced Surface Reconstruction Using Delaunay Triangulation Algorithm

Abhishek Bansal, Vishal Arora, Lovish Jaiswal


The goal of surface reconstruction is to find a surface from a given finite set of cloud points. We present a greedy Delaunay algorithm for surface reconstruction from unorganized point sets. Surface reconstruction is a highly challenging problem because sets of scattered points lack ordering information and connectivity, and may be contaminated by noise. Delaunay triangulation is a common method for domain triangulation. There are other algorithms for triangulating a point set in two or three dimensions, but not all of them are suitable for surface modeling. A triangular surface mesh homeomorphic to the original surface can be extracted directly from the tetrahedral mesh, provided a sufficient sampling density exists. Reverse engineering of geometric shapes is the process of converting a large number of measured data points into a concise and consistent computer representation. In this sense, it is the inverse of the traditional CAD/CAM procedures, which create physical objects from CAD models.


Implementation of a Novel Approach in Watermarking for the Secure Transmission of the User's Data

Shikha Goyal, Dr. P.C. Gupta


The watermarking process tags images, holograms, signatures, or other symbols in the background of a paper attached to text, images, audio, video, or any other medium that needs to show that the transmitted cover of communication is authenticated and authorized. The scheme uses a user-specified bit plane to hide a watermark image in a cover image. It adds noise and recovers the watermark from the noise-free and noisy images. In this paper, Least Significant Bit embedding has been implemented to authenticate and validate the transmission. MATLAB is used to write the code, perform the experiments, and obtain the results.
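The bit-plane embedding described above can be sketched in a few lines of Python (the paper's experiments are in MATLAB; the pixel values here are invented). One watermark bit is written into a user-specified bit plane of each cover pixel and later read back.

```python
# Bit-plane watermark embedding and extraction on 8-bit grayscale pixels.
# Plane 0 is the least significant bit, so distortion is at most 1 per pixel.

def embed(cover, bits, plane=0):
    """Overwrite the chosen bit plane of each pixel with a watermark bit."""
    mask = ~(1 << plane) & 0xFF
    return [(p & mask) | (b << plane) for p, b in zip(cover, bits)]

def extract(stego, plane=0):
    """Read the watermark bits back from the chosen bit plane."""
    return [(p >> plane) & 1 for p in stego]

cover = [120, 121, 200, 13, 77, 254]
watermark = [1, 0, 1, 1, 0, 1]

stego = embed(cover, watermark, plane=0)
recovered = extract(stego, plane=0)
```

Choosing a higher plane makes the watermark more robust to noise at the cost of visible distortion, which is the trade-off behind letting the user specify the plane.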


Management of Network Elements through Web UI Using Dojo and REST Calls

Naveen Ladwa, Kavitha S N


This paper focuses on the concept of collecting performance values of network elements that are located at different geographical locations. It aims at building a web user interface through which one can monitor and manage the behavior of network elements. Dojo is used to build the user interface, since it provides better security and cross-platform support; it is an open source modular JavaScript library designed to ease the rapid development of JavaScript/Ajax-based applications and web sites. REST is an architectural style for designing networked applications; it uses simple HTTP calls to connect machines or network elements. The network elements in an optical management system can thus be managed through the user interface.


Privacy Preserving Through Search Logs

Dr G. Rama Krishna, P.Mounika


Today search engine companies collect user information through search logs for the purpose of advertising and understanding the behavior of individual users. The challenge in this process is to preserve the privacy of the user. In this paper, we examine algorithms for releasing the frequent keywords, queries, and clicks of a search log. Previous methods of securing search logs, such as k-anonymity, are insecure against active attacks, while the guarantee ensured by differential privacy, used for the same purpose, is not efficient for this problem. This paper presents an observational study, using real usage data, comparing ZEALOUS with previous work that attains k-anonymity in search log publishing. ZEALOUS provides adequate results and a stronger utility guarantee than k-anonymity.
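The style of publishing being compared can be sketched as a two-threshold filter: keep a keyword only if its raw count clears a first threshold, perturb the count with Laplace noise, and publish only if the noisy count clears a second threshold. The thresholds, noise scale, and keyword counts below are illustrative stand-ins, not ZEALOUS's actual parameters.

```python
# Two-threshold noisy release of frequent keywords from a search log.
import math
import random

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) by inverse-CDF from a uniform draw."""
    while True:
        u = rng.random() - 0.5             # uniform on [-0.5, 0.5)
        if abs(u) < 0.5:                   # guard against log(0)
            break
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def publish(counts, k1=5, k2=10, scale=2.0, rng=None):
    rng = rng or random.Random(0)
    released = {}
    for keyword, count in counts.items():
        if count < k1:                     # rare keywords never leave the log
            continue
        noisy = count + laplace_noise(scale, rng)
        if noisy >= k2:                    # publish only robustly frequent ones
            released[keyword] = noisy
    return released

logs = {"weather": 5000, "news": 120, "rare disease xyz": 2}
released = publish(logs, rng=random.Random(42))
```

Rare keywords, which are the ones most likely to identify an individual, are filtered before any noise is even added; frequent keywords survive with only slightly perturbed counts, which is the utility side of the trade-off.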


A Review Paper on ZigBee- New Era in Agricultural Monitoring

Priyanka Yadav, Krishan Kumar, Rashmi Gupta


Agriculture is the backbone of the world's economy. It provides a way of livelihood to a large part of the world and has a great impact on the world's economy. Farmers face several problems in their crops due to inadequate rainfall, drought, pests, increased moisture, and other causes. In the traditional approach to measuring all these factors, the farmer needs to visit the field and check them at regular intervals of time. But now that the agricultural process has become extremely crucial for socio-economic change worldwide, agricultural techniques need to be improved. ZigBee technology emerges as the new standard of improved wireless technology for the field area. In our paper, we present the idea of how ZigBee can be utilized for agricultural monitoring and explain the ZigBee technology, its architecture, and its protocol stack.


A Methodology for Very Low Power and Small Size Capsule Manufacturing For Wireless Capsule Endoscopy

Preeti Rathore, Niraj Kumar


We present a method of reducing the size of the traditional capsule used in Wireless Capsule Endoscopy (WCE). The conventional wireless endoscopic capsule was manufactured on 18μm technology and its size is 26mm x 11mm. Here we use UltraScale FinFET 16nm technology to reduce the size of the capsule. By using this technique the size of the traditional capsule can be reduced by 11-12%. Using this method the leakage current is reduced and the power consumption is also reduced by up to 50% compared with the traditional capsule. The device can be operated at a very low operating voltage, and thus the life of the battery increases; the longer the battery life, the more images can be captured for examination, which is useful in disease diagnosis. Memory is required to store the captured images, and using FinFET 16nm technology the buffer size can also be reduced. The use of tri-gate or multigate MOSFETs further reduces leakage current and power consumption; this method increases system performance by up to 2-fold and speed by up to 30%.


Data Hiding by Using Image with LSB Algorithm

Akshara Srivastava, Dr. N. Chandra Sekhar Reddy, Er. Ramana Sai Poloju


The attention being paid to Reversible Data Hiding (RDH) in encrypted images is increasing day by day, as it maintains the excellent property that the original cover can be losslessly recovered after the embedded data is extracted, while protecting the confidentiality of the image content. All previous methods embed data by reversibly vacating room from the encrypted images, which may be subject to some errors in data extraction and/or image restoration. In this paper, we propose a novel method to reversibly embed data in the encrypted image by reserving room before encryption with a conventional RDH algorithm. The proposed method can achieve real reversibility, that is, data extraction and image recovery are free of any error. Experiments show that this novel method can embed more than ten times as large payloads for the same image quality as the previous methods.
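The "reserve room before encryption" idea can be illustrated with a toy stream cipher: the content owner leaves a chosen LSB plane untouched by the XOR keystream, the data hider writes the payload there without any key, and the receiver decrypts and extracts without error. The keystream, pixels, and payload are invented; real schemes additionally pack the original LSBs into the payload so that recovery of the cover is fully lossless.

```python
# Toy "reserving room before encryption" pipeline on 8-bit pixels.
import random

def xor_encrypt(pixels, key, keep_lsb=True):
    """XOR stream cipher that leaves bit 0 of every pixel unencrypted."""
    rng = random.Random(key)               # toy keystream, not a real cipher
    mask = 0xFE if keep_lsb else 0xFF
    return [p ^ (rng.randrange(256) & mask) for p in pixels]

def embed_lsb(pixels, bits):
    """Data hider writes payload bits into the reserved LSB plane, keyless."""
    return [(p & 0xFE) | b for p, b in zip(pixels, bits)]

def extract_lsb(pixels):
    return [p & 1 for p in pixels]

original = [10, 200, 33, 47, 128, 255]
payload = [1, 1, 0, 1, 0, 0]

encrypted = xor_encrypt(original, key=7)   # room (LSB plane) already reserved
marked = embed_lsb(encrypted, payload)     # embedding needs no encryption key
decrypted = xor_encrypt(marked, key=7)     # XOR with the same keystream inverts
```

Because the room was reserved before encryption, extraction works both in the encrypted domain and after decryption, and the upper seven bit planes of the cover come back exactly.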


Ensuring Security and Accountability for Decentralized Data in the Cloud

Ali Habeeb, Yousef Emami


Cloud computing enables highly scalable services to be easily consumed over the Internet on an as-needed basis. A major feature of the cloud services is that users' data are usually processed remotely in unknown machines that users do not own or operate. While enjoying the convenience brought by this new emerging technology, users' fears of losing control of their own data (particularly, financial and health data) can become a significant barrier to the wide adoption of cloud services. To address this problem, in this paper, we propose a novel highly decentralized information accountability framework to keep track of the actual usage of the users' data in the cloud. In particular, we propose an object-centered approach that enables enclosing our logging mechanism together with users' data and policies. We leverage the JAR programmable capabilities to both create a dynamic and traveling object, and to ensure that any access to users' data will trigger authentication and automated logging local to the JARs. To strengthen user's control, we also provide distributed auditing mechanisms. We provide extensive experimental studies that demonstrate the efficiency and effectiveness of the proposed approaches.


Disease Prediction in Data Mining Techniques

K. Aparna, Dr. N. Chandra Sekhar Reddy, I. Surya Prabha, Dr. K. Venkata Srinivas

Data mining (sometimes called Knowledge Discovery in Databases) is the process of analyzing data from different databases and converting it into useful information that can be used to make predictions on datasets and extract knowledge for making decisions in future transactions. Mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases. Nowadays, the availability of huge amounts of e-medical and PHR data leads to the need for powerful data analysis tools to extract useful knowledge. Research has been done on disease prediction, which has shown promising results. Heart disease has been the leading cause of death all over the world in the past ten years. Several researchers are using statistical and data mining tools to help health care professionals in the diagnosis of heart disease. Using a single data mining technique in the diagnosis of heart disease has been comprehensively investigated, showing acceptable levels of accuracy. In this paper we propose hybrid data mining techniques for the prediction and diagnosis of heart disease.


Providing OTP Authentication for Mobile Applications Using a Web Application

HimaBindu.K, Dr. N. Chandra Sekhar Reddy, I.Surya Prabha, Dr. K. Venkata Srinivas


Text passwords are the most common form of user authentication on websites due to their simplicity and convenience. However, users' passwords are easily stolen and compromised under different threats, and two types of mistakes are commonly made by users. First, users select weak passwords and reuse the same passwords across different websites; reusing passwords across websites may cause users to lose the information stored on those sites once the password is hacked or compromised by an attacker. Second, typing passwords into untrusted computers leads to password threats and theft, as hackers can install malicious software to capture passwords when users type their username and password into unknown public computers. In this paper, we design a user authentication protocol named Opass which uses a user's cell phone and short message service to thwart password stealing and password reuse attacks. Opass only requires each participating website to possess a unique phone number, and it involves a telecommunication service provider in the registration and recovery phases. We develop a web-based security analysis of one-time password authentication schemes using a mobile application. With this protocol, users only need to remember one long-term password for login on different websites.
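One-time password generation of the kind such a protocol can rely on is standardized as HOTP (RFC 4226): an HMAC-SHA-1 over an incrementing counter, dynamically truncated to a short decimal code. The sketch below implements that standard; the secret shown is the RFC's own test key, and how Opass actually derives and delivers its codes over SMS is not reproduced here.

```python
# HOTP (RFC 4226): counter-based one-time passwords from HMAC-SHA-1.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Return the `digits`-digit HOTP code for (secret, counter)."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F             # RFC 4226 dynamic truncation
    code = ((digest[offset] & 0x7F) << 24
            | digest[offset + 1] << 16
            | digest[offset + 2] << 8
            | digest[offset + 3])
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test secret; the counter advances after each login,
# so a stolen code is useless for the next authentication attempt.
otp = hotp(b"12345678901234567890", counter=0)
```

Because every code is bound to a fresh counter value, a phished or keylogged code cannot be replayed, which is exactly the property that defeats the password-stealing attacks described above.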


An Efficient and Scalable Data Possession in Cloud by Using AES

Sangeeta Raga, Dr.N. Chandra Sekhar Reddy, G.Praveen Babu, I.Surya Prabha


Cloud computing is the delivery of computing and storage capacity as a service to a community of end-recipients. Cloud computing entrusts services with a user's data, software, and computation over a network. Cloud storage enables users to remotely store their data and enjoy on-demand high-quality cloud applications without the burden of local hardware and software management. Though the benefits are clear, such a service also relinquishes users' physical possession of their outsourced data, which unavoidably poses new security risks to the correctness of the data in the cloud. In order to address this new problem and further achieve a secure and dependable cloud storage service, we propose in this paper a flexible distributed storage integrity auditing mechanism, utilizing the homomorphic token and distributed erasure-coded data. The proposed design allows users to audit the cloud storage with very lightweight communication and computation cost. The auditing result not only ensures a strong cloud storage correctness guarantee, but also simultaneously achieves fast data error localization, i.e., the identification of misbehaving servers. Considering that cloud data are dynamic in nature, the proposed design further supports secure and efficient dynamic operations on outsourced data, including block modification, deletion, and append. Analysis shows the proposed scheme is highly efficient and resilient against Byzantine failure, malicious data modification attacks, and even server colluding attacks.
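The error-localization half of such auditing can be sketched with per-block keyed tokens: the user keeps a token per stored block and later recomputes it against what the server returns. The stand-in below uses plain HMAC-SHA-256 rather than the paper's homomorphic tokens (which allow aggregated challenges), so it illustrates only how a tampered block is pinpointed.

```python
# Per-block integrity tokens and audit with error localization.
import hashlib
import hmac

def make_tokens(blocks, key):
    """Precompute one keyed token per erasure-coded block before outsourcing."""
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

def audit(blocks_on_server, tokens, key):
    """Return the indices of blocks whose token no longer verifies."""
    return [i for i, (b, t) in enumerate(zip(blocks_on_server, tokens))
            if not hmac.compare_digest(
                hmac.new(key, b, hashlib.sha256).digest(), t)]

key = b"user-secret-key"
blocks = [b"block-0 data", b"block-1 data", b"block-2 data"]
tokens = make_tokens(blocks, key)

tampered = [blocks[0], b"block-1 MODIFIED", blocks[2]]
bad = audit(tampered, tokens, key)         # localizes the misbehaving block
```

Once the bad indices are known, the erasure coding mentioned in the abstract is what allows the corrupted blocks to be reconstructed from the surviving ones.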


Health Monitoring Scheme Based on Cloud Computing

P.Shweta, Dr.N. Chandra Sekhar Reddy, Venkata Ramana Murty Tangella


Cloud-assisted mobile health (mHealth) monitoring, which applies the prevailing mobile communications and cloud computing technologies to provide feedback decision support, has been considered a revolutionary approach to improving the quality of healthcare service while lowering the healthcare cost. Unfortunately, it also poses a serious risk to both clients' privacy and the intellectual property of monitoring service providers, which could deter the wide adoption of mHealth technology. This paper addresses this important problem and designs a cloud-assisted privacy-preserving mobile health monitoring system to safeguard the privacy of the involved parties and their data. Moreover, the outsourcing decryption technique and a newly proposed key private proxy re-encryption are adapted to shift the computational complexity of the involved parties to the cloud without compromising clients' privacy and service providers' intellectual property. Finally, our security and performance analysis demonstrates the effectiveness of our proposed design.


Self Motivation for Providing Allocation Using Virtual Machine for Cloud Computing Environment

P.Shraddha, Dr. N. Chandra Sekhar Reddy, I.Surya Prabha, Dr. K. Venkata Srinivas


Cloud computing is a relatively recent term that defines a path ahead in the computer science world. Built on decades of research, it draws on recent achievements in virtualization, distributed computing, utility computing, and networking. It implies a service-oriented architecture: software and platforms offered as services, reduced information technology overhead for the end user, great flexibility, reduced total cost of ownership, and on-demand services, among other things. Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers them. It describes a new supplement, consumption, and delivery model for IT services based on Internet protocols, typically involving the provisioning of dynamically scalable resources, and it uses multiple server computers connected via a digital network as though they were one computer. In cloud computing, providers can offer consumers two provisioning plans for computing resources, namely reservation and on-demand plans. In general, the cost of computing resources provisioned under the reservation plan is cheaper than under the on-demand plan, since the consumer pays the provider in advance; with the reservation plan, the consumer can therefore reduce the total resource-provisioning cost. However, the best advance reservation of resources is difficult to achieve because of uncertainty about the consumer's future demand and the providers' resource prices, so the consumer still needs on-demand services to cover fluctuating and unpredictable demand. Here we propose a server-analysis process performed before a user accesses on-demand services from the cloud: the server that suits the user's needs and can complete the task in the least time is chosen to execute the user's task. Since on-demand services are billed on a pay-per-use basis, as execution time decreases, the cost of the on-demand services decreases.
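The server-analysis step the abstract proposes can be sketched roughly as follows. This is an illustrative reading, not the paper's implementation: server names, throughputs, and hourly prices are all invented, and execution time is assumed to be task size divided by throughput.

```python
# Hypothetical sketch: given each server's estimated throughput and an
# hourly on-demand price, pick the server that finishes the task at the
# lowest cost. Under pay-per-use billing, cost scales with execution
# time, so a faster server can be cheaper despite a higher hourly rate.

def cheapest_on_demand_server(servers, task_size):
    """servers: dict name -> (throughput in units/hour, price per hour).
    Returns (server_name, cost) for the lowest-cost choice."""
    best_name, best_cost = None, float("inf")
    for name, (throughput, price_per_hour) in servers.items():
        hours = task_size / throughput      # estimated execution time
        cost = hours * price_per_hour       # pay-per-use: time drives cost
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name, best_cost

servers = {
    "s1": (100.0, 2.0),   # slower but cheap per hour
    "s2": (400.0, 5.0),   # faster, higher hourly rate
}
name, cost = cheapest_on_demand_server(servers, task_size=1000.0)
print(name, cost)  # s2: 2.5 h at $5/h = $12.50, beating s1's 10 h at $2/h
```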


Finding Guilty Agent by Using Fake Object Allocation

M. Srilekha, Dr. N. Chandra Sekhar Reddy, Dr. G. Poshal


Data stored in an organization's computers is extremely important and embodies the core of the organization's power, and a corporation undoubtedly wants to preserve and retain that power. On the other hand, this data is essential for daily work processes. Users within the organization's perimeter (e.g., employees, subcontractors, or partners) perform numerous actions on this data (e.g., query, report, and search) and may be exposed to sensitive data embodied in the information they access. In an attempt to determine the extent of harm a user could cause to the organization by misusing the data she has obtained, we introduce the concept of Misuseability Weight. The M-score measure is tailored to tabular data sets (e.g., result sets of relational database queries) and cannot be applied to non-tabular data such as intellectual property or business plans. It is a domain-independent measure that assigns a score representing the misuseability weight of each table exposed to the user, using a sensitivity score function supplied by a domain expert. By assigning a score that represents the sensitivity level of the data a user is exposed to, the misuseability weight can establish the extent of harm to the organization if the data is misused; using this information, the organization can then take appropriate steps to prevent or minimize the damage.
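The flavor of such a measure can be sketched as below. This is not the paper's M-score formula; the per-column sensitivity values, the simple sum-times-rows aggregation, and all names are invented for illustration of a domain expert's sensitivity function applied to a tabular result set.

```python
# Illustrative sketch of a misuseability-weight-style score: combine a
# domain expert's per-column sensitivity scores with the number of
# records exposed in a query's result set. All values are hypothetical.

COLUMN_SENSITIVITY = {          # assigned by a domain expert
    "name": 0.2,
    "salary": 0.9,
    "diagnosis": 1.0,
}

def m_score(rows, columns):
    """Sum the exposed columns' sensitivities, scaled by row count:
    a wider, larger result set is more misuseable."""
    row_score = sum(COLUMN_SENSITIVITY.get(c, 0.0) for c in columns)
    return row_score * len(rows)

result_set = [("alice", 90000), ("bob", 85000)]
print(round(m_score(result_set, ["name", "salary"]), 2))  # 2.2
```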


A Network and Device Based Mobile Streaming

Srinu A, Dr. N. Chandra Sekhar Reddy, T. V. Ramana Murthy


Cloud multimedia services provide an efficient, flexible, and scalable way of processing data and offer a solution for user demands for high-quality and diverse multimedia. As intelligent mobile phones and wireless networks become more and more common, network services for users are no longer restricted to the home: multimedia data can easily be obtained using mobile devices, allowing users to enjoy ubiquitous network services. Considering the limited bandwidth available for mobile streaming and the differing needs of devices, this study presents a network- and device-aware Quality of Experience (QoE) and Quality of Service (QoS) approach that provides multimedia data suited to the terminal-unit environment via interactive mobile streaming services. It further takes the current network environment into account, adjusting the interactive transmission frequency and applying dynamic multimedia transcoding, to avoid wasting bandwidth and terminal power. Finally, this study implemented a prototype of the design to validate the quality of the proposed method. According to the experiments, this system can provide efficient self-adaptive multimedia streaming services for varied bandwidth environments.
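The network- and device-aware adaptation described above can be sketched as a simple profile selector. This is a minimal illustration, not the study's algorithm: the transcoding profiles, bitrate thresholds, and resolutions below are hypothetical.

```python
# Sketch of network- and device-aware stream adaptation: choose a
# transcoding profile from the measured bandwidth and the device's
# screen height, so neither bandwidth nor battery is wasted on frames
# the terminal cannot display. Profiles are illustrative.

PROFILES = [  # (name, required_kbps, frame_height_px), best first
    ("1080p", 5000, 1080),
    ("720p",  2500, 720),
    ("360p",   800, 360),
]

def choose_profile(bandwidth_kbps, screen_height_px):
    for name, kbps, height in PROFILES:
        if kbps <= bandwidth_kbps and height <= screen_height_px:
            return name
    return PROFILES[-1][0]  # fall back to the lowest profile

print(choose_profile(3000, 720))   # 720p: link too slow for 1080p
print(choose_profile(10000, 480))  # 360p: small screen, save bandwidth
```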


Balancing the Load in Distributed File Systems With Cloud Computing

M. Sujitha, Dr. N. Chandra Sekhar Reddy, N. V. Krishna Rao


Distributed file systems are key building blocks for cloud computing applications based on the MapReduce programming paradigm. In such file systems, nodes simultaneously serve computing and storage functions: a file is partitioned into a number of chunks allocated to distinct nodes so that MapReduce tasks can be performed in parallel over the nodes. However, in a cloud computing environment, failure is the norm, and nodes may be upgraded, replaced, and added to the system; files can also be dynamically created, deleted, and appended. This results in load imbalance in a distributed file system; that is, the file chunks are not distributed as uniformly as possible among the nodes. Emerging distributed file systems in production strongly depend on a central node for chunk reallocation. This dependence is clearly inadequate in a large-scale, failure-prone environment, because the central load balancer is put under considerable workload that scales linearly with the system size and may thus become the performance bottleneck and the single point of failure. In this paper, a fully distributed load-rebalancing algorithm is presented to cope with the load-imbalance problem. Our algorithm is compared against a centralized approach in a production system and a competing distributed solution presented in the literature. The simulation results indicate that our proposal is comparable with the centralized approach and considerably outperforms the prior distributed algorithm in terms of load-imbalance factor, movement cost, and algorithmic overhead. The performance of our proposal implemented in the Hadoop distributed file system is further investigated in a cluster environment.
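The rebalancing goal can be illustrated with a toy sketch. This is not the paper's fully distributed algorithm (which avoids any global view); it only shows the invariant being sought: chunks migrate from overloaded to underloaded nodes until every node is near the average. Node names and chunk counts are invented.

```python
# Toy sketch of chunk rebalancing: move chunks from nodes above the
# average load to nodes below it, recording the migrations. A real
# distributed scheme would do this pairwise without global knowledge.

def rebalance(loads):
    """loads: dict node -> chunk count (mutated in place).
    Returns the list of (source, destination) chunk moves."""
    moves = []
    avg = sum(loads.values()) / len(loads)
    heavy = [n for n in loads if loads[n] > avg]
    light = [n for n in loads if loads[n] < avg]
    for h in heavy:
        for l in light:
            # migrate one chunk at a time while both stay within bounds
            while loads[h] - 1 >= avg and loads[l] + 1 <= avg:
                loads[h] -= 1
                loads[l] += 1
                moves.append((h, l))
    return moves

loads = {"n1": 8, "n2": 1, "n3": 3}
rebalance(loads)
print(loads)  # every node ends at the average of 4 chunks
```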


Test Case Minimization in Component Based Software Models

Arvind Kumar, Dr. Sushil Garg


Test case minimization techniques try to schedule test cases in an execution order according to a given criterion. The main purpose of this minimization is to increase the likelihood that, if the test cases are used for regression testing in the given order, they will meet the objective more closely than if they were executed in some other order. For instance, testers might schedule test cases in an order that achieves code coverage at the fastest rate possible, exercises characteristics in order of expected frequency of use, or increases the probability of detecting faults early in testing.
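The "coverage at the fastest rate" criterion mentioned above can be sketched as a greedy "additional coverage" ordering: repeatedly pick the test that covers the most not-yet-covered statements. This is one standard reading of such a criterion, not necessarily the paper's technique; the test names and coverage sets are hypothetical.

```python
# Greedy additional-coverage prioritization: at each step, schedule the
# test case that adds the most new statement coverage, so cumulative
# coverage rises as fast as possible early in the regression run.

def prioritize(tests):
    """tests: dict name -> set of covered statements. Returns an order."""
    remaining = dict(tests)
    covered, order = set(), []
    while remaining:
        # pick the test adding the most new coverage (ties broken by name)
        best = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

tests = {
    "t1": {1, 2, 3},
    "t2": {3, 4, 5, 6},
    "t3": {1, 6},
}
print(prioritize(tests))  # ['t2', 't1', 't3']
```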