Volume-12 Issue-1 - SLA Based Rescheduling Of Task For Optimum Allocation Of
Resources In Cloud Computing Environment
|
Ms. Nikky Ahuja, Dr. Priyesh Kanungo, Dr. Sumant Katiyal
|
Volume-12 Issue-1 - Detection of Copy-Move Forgery using Normalized Cross
Correlation and Fast Fourier Transform
|
Ms. Apoorva Katyayen, Dr. Ajay Khuteta
|
Volume-12 Issue-1 - A Comparative Study of Andrew File System and Hadoop
Distributed File System Framework to Manage Big Data
|
Mr. Rajesh Savaliya, Dr. Akash Saxena
|
Volume-12 Issue-1 - Integrated Development Environment for IoT-Based Sensor
Data Processing in Smart City
|
Dr. Kavita Ahuja
|
Volume-12 Issue-1 - A Feature Based Semi-Fragile Watermarking Mechanism for
Gray-Scale Digital Image Authentication.
|
Ms. Hiral Patel
|
Volume-12 Issue-1 - Preventing CSRF Attacks by Verifying Redirection Request
and User Session
|
Dr. Purva Desai
|
Volume-12 Issue-1 - Cluster Validation of Evolutionary Clustering Algorithm for
Multivariate Datasets
|
Ms. Jyoti Lakhani, Dr. Ajay Khunteta, Dr. Anupama Chowdhary, Dr. Dharmesh Harwani
|
Volume-11 Issue-2 - Two New Eigenvector-Based Approaches To Assign Weights
To Decision Makers In Group Decision Making Under
Multiple Criteria
|
Mr. Mohammad Azadfallah
|
Volume-11 Issue-2 - Offline Gaming Vs Cloud Gaming
|
Mr. Amit Khatri
|
Volume-11 Issue-2 - An Open Source Threat Detection Engine With Visualization
Framework To Uncover Threats From Offline Pcap Files
|
Amit Mahajan, Dr. Maninder Singh, Dr. Vibhakar Mansotra
|
Volume-11 Issue-2 - Discrete Cosine Transform For Script Identification And Character Recognition
|
Mr. Shailesh Chaudhari
|
Volume-11 Issue-1 - Innovative Feature Selection For Effective Context Resolution Using Natural Language Query Interface
|
Dr. Amisha Shingala and Dr. Priti Sajja
|
Volume-11 Issue-1 - Research Review On Feature Extraction Methods Of Human Being's X-Ray Image Analysis
|
Mr. Anil K. Bharodiya and Prof. Dr. Atul M. Gonsai
|
Volume-11 Issue-1 - Indian Sign Language Recognition System For Deaf And Dumb Using Image Processing And Fingerspelling: A Technical Review
|
Mr. Rakesh Savant and Ms. Amrutha Ajay
|
Volume-11 Issue-1 - Medical Image Enhancement Through Deep Learning Methods
|
Mr. Shivang M. Patel and Dr. Jyotindra N. Dharwa
|
Volume-11 Issue-1 - Detection Of Malignant Melanoma With Supervised Learning: A Review
|
Ms. Neena Agrawal and Mr. Vineet Khanna
|
Volume-11 Issue-1 - Optical Character Recognition Using Deep Learning Technical Review
|
Ms. Preeti P. Bhatt and Ms. Isha Patel
|
Volume-11 Issue-1 - Document Classification: A Technical Review
|
Mr. Manish Vala
|
Volume-10 Issue-2 - Multilayer Architecture of Parallel - Genetic - Fuzzy System:
A Case of Effective Transportation for Co-Operatives in India
|
Dr. Priti S. Sajja
|
Volume-10 Issue-2 - A Review on Learning Repositories and Fuzzy XML in Education Field
|
Ms. Mona G Dave, Prof. (Dr.) P V Virparia
|
Volume-10 Issue-2 - A Proposed EDM Framework for Improving Student Performance
|
Mr. Bhavesh R. Patel
|
Volume-10 Issue-2 - A Comparative Analysis of Different Measurement Scale and Normalization
Method Performances in ELECTRE Method
|
Dr. Mohammad Azadfallah
|
Volume-10 Issue-2 - Comparison of String Similarity Algorithms to Measure Lexical
Similarity
|
Mr. Sagar J. Gandhi, Mr. Mihirraj M. Thakor, Dr. Jikitsha Sheth, Mr. Hariom I. Pandit,
Mr. Hemin S. Patel
|
Volume-10 Issue-2 - Study and Comparison of Mobile Banking Applications – Towards
the Trust Perspective
|
Mr. Akoramurthy.B, Ms. Arthi.J
|
Volume-10 Issue-1 - The Impacts of Aggregation Rules, Measurement Scale and Normalization
Methods on Ranking of Alternatives in AHP
|
Dr. Mohammad Azadfallah
|
Volume-10 Issue-1 - Comprehensive Study on Recognition of offline Handwritten Gujarati
Numerals
|
Mr. Bharat C. Patel, Dr. Manish M. Kayasth
|
Volume-10 Issue-1 - Comprehensive Study on Gujarati Handwritten Character Recognition
|
Mr. Jitendra B. Upadhyay, Dr. Kalpesh B. Lad
|
Volume-10 Issue-1 - A Technical Review on Comparison and Estimation of Steganographic
Tools
|
Ms. Preeti P Bhatt, Mr. Rakesh Savant
|
Volume-10 Issue-1 - Detection of Wormhole Attack in VANET
|
Mr. Parteek Kumar, Mr. Sahil Verma, Ms. Kavita
|
Volume-10 Issue-1 - Influence of the Different Measurement Scale and Normalization
Method on Results in TOPSIS
|
Dr. Mohammad Azadfallah
|
Volume-9 Issue-2 - Prediction based Dynamic Load Balancing Algorithm for Distributed
System
View PDF Abstract
Dynamic load balancing is one of the major requirements of distributed systems for
the effective utilization of resources. Existing dynamic load balancing algorithms
work on the basis of the load data of incoming jobs and take corrective actions
only once the system becomes unbalanced. In this paper we propose a hybrid
architecture for distributed systems that combines the benefits of centralized and
distributed approaches. We have designed a prediction based dynamic load balancing
algorithm and tested it in the proposed architecture to verify its correctness. The
proposed algorithm empowers the system to take corrective scheduling actions on the
basis of prediction and thus avoids the problem of unbalanced load in distributed
systems.
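As a rough illustration of the prediction idea (the abstract does not specify the
paper's actual predictor, so every name and parameter below is a hypothetical
stand-in), a per-node load forecast can be as simple as an exponential moving
average, with jobs dispatched to the node with the lowest predicted load:

```python
# Minimal sketch of prediction-based load balancing: each node's future load
# is forecast with an exponential moving average, and jobs are dispatched to
# the node with the lowest predicted load. The smoothing factor `alpha` and
# the node names are illustrative assumptions, not values from the paper.

class Node:
    def __init__(self, name, alpha=0.5):
        self.name = name
        self.alpha = alpha
        self.predicted_load = 0.0

    def observe(self, measured_load):
        # Exponential moving average as a simple one-step-ahead predictor.
        self.predicted_load = (self.alpha * measured_load
                               + (1 - self.alpha) * self.predicted_load)

def dispatch(job_cost, nodes):
    # Proactive scheduling: pick the node expected to be least loaded.
    target = min(nodes, key=lambda n: n.predicted_load)
    target.observe(target.predicted_load + job_cost)
    return target.name

nodes = [Node("n1"), Node("n2"), Node("n3")]
print(dispatch(job_cost=2.0, nodes=nodes))
```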
|
Mr. Devendra V. Thakor, Dr. Bankim C. Patel
|
Volume-9 Issue-2 - Gender and Number Identification for Gujarati word: Rule-Based
Approach
View PDF Abstract
A morphological analyzer is used to analyze the words in a sentence and provide
their grammatical meaning. Morphology is the study of the internal structure of
words, the relations between words, and their meanings; a morpheme is the smallest
unit of grammar in linguistics. Analyzers are used in applications such as POS
tagging, chunking, text summarization and machine translation. Gujarati is a
morphologically complex language. In this paper the authors present a rule-based
morphological analyzer; the features focused on are part of speech, gender, number
and tense.
|
Mr. Sandeep Maurya, Ms. Khushbu Maurya, Mr. Gaurav Chaudhary, Mr. Pinkesh Patel
|
Volume-9 Issue-2 - A Study of Security of DICOM Images while Exchange over Public
Networks
View PDF Abstract
Advancement in the medical industry has greatly contributed to medical imaging with
the advent of PACS (Picture Archiving and Communication Systems) and RIS/HIS
(Radiological/Hospital Information Systems). Such integrated healthcare systems
allow storage and dissemination of medical data which can be accessed with ease.
These systems were the pioneers in the migration of radiological film-based data
into electronic format. Medical images in various formats are acquired, stored,
transmitted and displayed digitally. This has resulted in various issues that
threaten the security of digitized medical data. This paper focuses on the major
issues concerned with the storage and sharing of images of any modality over public
networks. An attempt is made to present a detailed study of recent research
methodologies, their success rates and issues. At the end of the paper, the proposed
system is discussed to meet specific research requirements.
|
Ms. Chandrakala Chetri, Dr. Subhashchandra Desai, Dr. Kalpesh Lad
|
Volume-9 Issue-2 - Analyzing Spatial Autocorrelation in distribution of soil chemical
properties using Moran's I and Variogram for talukas of Surat district
View PDF Abstract
Analysis of the spatial distributions of chemical properties (pH, EC and OC) in
soils is often influenced by spatial autocorrelation. When using geostatistical
methods, variogram modelling is one of the prerequisites. This process depends upon
the distribution of the data, whether it is clustered or dispersed. This in turn
affects the selection of the variogram parameters, namely nugget, sill and range,
which further affect the kriging results. Therefore, this paper proposes the use of
spatial autocorrelation as a preprocessing step for geostatistical applications.
Five talukas of Surat district have been taken for study. Moran's I was calculated
for each of the three chemical properties for every taluka. Results were analyzed to
check whether there exists any spatial autocorrelation or no correlation at all. It
was revealed that there existed a few clusters, whereas most of the talukas had
dispersed patterns in terms of the concentration of chemical properties.
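For reference, global Moran's I for values x_i with spatial weights w_ij is
I = (n / sum_ij w_ij) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2.
A minimal NumPy sketch with hypothetical pH readings and binary contiguity weights
(not the paper's data):

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w.

    I = (n / W) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    W = w.sum()
    return (n / W) * (z @ w @ z) / (z @ z)

# Toy example: 4 locations with binary contiguity weights (hypothetical data).
ph = np.array([6.8, 7.1, 7.9, 8.2])
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i(ph, w))  # > 0 suggests clustering, < 0 dispersion
```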
|
Ms. Khushbu Mulatani, Mr. Mohammad Aswat, Mr. Mohsin Jadav, Ms. Jaishree Tailor
|
Volume-9 Issue-1 - Face Features Based Side View Face Recognition System
View PDF Abstract
Face recognition is still a very challenging issue in surveillance systems, whether
in a known or an uninhibited environment. The basic step of face recognition is to
observe the human face for further investigation, and in any circumstances it
remains challenging because human expression and orientation may vary at each
moment.
The proposed work aims at the development of a highly efficient human face
recognition system that deals with front as well as side view faces with normal
facial expression. Using a face detection algorithm, it accumulates only the face
region in the preprocessing task. Facial components like the eyes, nose and lips are
extracted from the whole face region using an edge detection method. To identify
feature similarity, it compares feature-point coalitions by calculating the matched
points among the face features. The authors use a corner detection method which
picks boundary pixels from the region of interest and retrieves the maximum match
value among all stored images. Based on a threshold value, the system decides
whether the human face is recognized or not.
|
Mr. Rohan K. Naik, Dr. Kalpesh B. Lad
|
Volume-9 Issue-1 - Recognition of Offline Handwritten Gujarati using Structural
Features
View PDF Abstract
Development of handwritten optical character recognition for various Indian scripts
is in demand for the dissemination of ICT in India. Although there is a significant
amount of research on printed Gujarati text, recognition of offline handwritten
Gujarati text is still at a preliminary level. Proper selection of character
features plays an important role in the development of a character recognition
system.
The feature extraction process described in this paper extracts important features
of characters based on the water reservoir formed by a character and the radial
histogram, which are used for identification of the Gujarati consonant characters.
Preprocessing is applied before extracting features from the character.
|
Mr. Jitendra V. Nasriwala, Dr. Bankim C. Patel
|
Volume-9 Issue-1 - A critical study of challenges in educational opinion mining
of text written in Gujarati language
View PDF Abstract
The field of opinion mining has gained much popularity in the last few years. Many
new techniques and methods are being developed in different languages like English,
Hindi etc. However, the authors have noted that there is no significant progress for
languages like Gujarati. This paper discusses a few of the challenges that can be
faced while implementing opinion mining on Gujarati language text, including some
language-independent challenges.
|
Ms. Himadri H. Patel, Dr. Bankim C. Patel
|
Volume-9 Issue-1 - An Experimental Comparison of Different Rules for Aggregation,
in AHP
View PDF Abstract
The current study presents a comparative analysis of different aggregation rules
(for instance, the two traditional aggregation rules, additive and multiplicative,
and their new variants, the exponential and logarithmic transformations) adopted in
the Analytic Hierarchy Process (AHP), by testing them against a problem with a known
composite answer. The results showed that the additive aggregation rule seems to be
more satisfactory than the others. In continuation, we generated 100 decision
matrices with random numbers in the range of 1 to 9 (consistent with the AHP scale).
The results indicated that in both of the newly introduced aggregation rules (the
exponential and logarithmic transformations) a portion of the most important
alternative's value shifts to the other alternatives, particularly to the worst
alternatives. Therefore, in these transformations, the discriminating level of the
most important alternative will decrease.
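A minimal sketch of the two traditional aggregation rules compared above; the local
priorities and criteria weights are illustrative, not taken from the study:

```python
import numpy as np

# Local priorities of 3 alternatives under 2 criteria (rows: alternatives)
# and criteria weights. All numbers are illustrative assumptions.
p = np.array([[0.6, 0.2],
              [0.3, 0.3],
              [0.1, 0.5]])
w = np.array([0.7, 0.3])

additive = p @ w                          # weighted arithmetic aggregation
multiplicative = np.prod(p ** w, axis=1)  # weighted geometric aggregation
print(additive, multiplicative)
```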
|
Mr. Mohammad Azadfallah
|
Volume-9 Issue-1 - Mining Search Log for Keyword Stuffing
View PDF Abstract
The core concept behind keyword stuffing is to use a combination of relevant and
irrelevant keywords whose semantics do not match, which ultimately leads to a poor
ranking of the website. Search engine spiders interpret and detect places in a
website where keywords are stuffed, and consider this a negative design element.
Websites should be designed with high-quality, well-written content, with keywords
confined to a limited region of the website. Even though keyword stuffing is
unlikely to lead to search engine penalties, and webmasters can recover if
penalized, it can deter human visitors and reduce the value of the website.
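A simple keyword-density heuristic of the kind such detection builds on (the
threshold and the detection logic here are illustrative assumptions, not the
paper's method):

```python
from collections import Counter
import re

def keyword_density(text, keyword):
    # Fraction of all words that are the given keyword; very high values are
    # a common heuristic signal of keyword stuffing.
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)[keyword.lower()] / max(len(words), 1)

page = "cheap shoes buy cheap shoes online cheap shoes best cheap shoes"
print(keyword_density(page, "cheap"))  # flag if above a chosen threshold
```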
|
Ms. Bhoomika A. Ahir, Ms. Urvashi A. Bhalala, Ms. Azmeena R. Pathan, Ms. Pinal
L. Gangani, Ms. Suhani D. Patel, Ms. Poonam Godhwani
|
Volume-8 Issue-2 - Reducing the gap between two MADM models
View PDF Abstract
Several methods have been proposed for solving Multi-Attribute Decision-Making
(MADM) problems. A major criticism of MADM is that different techniques may yield
different results when applied to the same problem. In this paper, we investigate
the performance of two well-known MADM models: 1. AHP, and 2. TOPSIS. Although there
is no exact way to know which model gives the right answer, AHP was selected as the
basis against which to compare the other method, because it is extremely popular in
practice. Then, by changing the separation measure in the TOPSIS model from P=2
(Euclidean distance) to other values (P≠2, i.e. 1.1, 1.2, etc., based on Birnbaum,
1998, p. 185), the results are investigated.
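A minimal TOPSIS sketch with the separation measure generalized from the Euclidean
case (p=2) to an arbitrary Minkowski exponent p, mirroring the variation studied;
the decision matrix and weights are illustrative, and all criteria are assumed to
be benefit criteria:

```python
import numpy as np

def topsis(decision, weights, p=2.0):
    """TOPSIS ranking with a generalized Minkowski separation measure.

    p=2 gives the usual Euclidean distance; other values (e.g. 1.1, 1.2)
    reproduce the variation discussed in the abstract.
    """
    d = np.asarray(decision, dtype=float)
    v = weights * d / np.linalg.norm(d, axis=0)   # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)    # benefit criteria assumed
    s_plus = (np.abs(v - ideal) ** p).sum(axis=1) ** (1 / p)
    s_minus = (np.abs(v - anti) ** p).sum(axis=1) ** (1 / p)
    return s_minus / (s_plus + s_minus)           # higher = better

decision = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]      # 3 alternatives, 3 criteria
print(topsis(decision, weights=np.array([0.5, 0.3, 0.2]), p=1.1))
```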
|
Mohammad Azadfallah
|
Volume-8 Issue-2 - Blind Watermarking scheme for Gray scale Images using DCT
View PDF Abstract
Digital watermarking schemes are used to protect the copyrights of owners. We
propose a blind watermarking scheme based on the DCT (Discrete Cosine Transform).
The proposed method hides a watermark in a gray-scale image: the digital color image
is transformed into a gray-scale image, and a block-based DCT algorithm is then used
for the watermarking process. For embedding, all bits of the watermark image are
embedded in various DCT blocks; to extract the watermark, the same process is
performed in inverse manner. A conventional watermarking scheme requires the
original image to retrieve the embedded watermark, whereas a blind watermarking
process does not require the cover image.
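A minimal sketch of blind DCT watermarking in this spirit: one watermark bit per
8x8 block, encoded in the sign of a mid-frequency coefficient. The coefficient
index and embedding strength are illustrative choices, not the paper's parameters:

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, strength=20.0, idx=(4, 3)):
    # Force the sign of one mid-frequency DCT coefficient to carry the bit.
    c = dctn(block.astype(float), norm="ortho")
    c[idx] = strength if bit else -strength
    return idctn(c, norm="ortho")

def extract_bit(block, idx=(4, 3)):
    # Blind extraction: only the watermarked block is needed, no cover image.
    return int(dctn(block.astype(float), norm="ortho")[idx] > 0)

block = np.random.randint(0, 256, (8, 8))
wm = embed_bit(block, bit=1)
print(extract_bit(wm))  # -> 1
```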
|
Nidhi Patel, Pinal Suratwala, Rupal Patel, Bhakti Kaneriya, Dhara Soliya, Mansi
Gandhi, Vivek Fumakia
|
Volume-8 Issue-2 - Noise Detection and Reduction in Printed English Text
View PDF Abstract
Printed character recognition has been one of the most interesting and challenging
research areas in field of image processing and pattern recognition in the recent
years. OCR has an improve efficiency and less computational cost, to use database
recognize English character which is managed and very simple. It is digitizing process
by handwritten or printed text that can be scanned electronically and used in. machine
process. Its applying appropriate methods to scanned image and denoised image to
obtain saved for further processing.
|
Ishita Patel, Dhwanil Patel, Foram Patel, Bhumika Chauhan, Rinal Mistry, Khushbu
Patel
|
Volume-8 Issue-2 - Gujarati Language Speech Recognition System for Identifying Smartphone
Operation Commands
View PDF Abstract
Natural Language Processing provides the facility of operating a system with speech
so that the system can interact with the user accordingly. Many speech recognition
applications also provide support for regional languages. In this paper, we discuss
work that adds to the state-of-the-art facilities for speech recognition of the
Gujarati language. We have designed and developed a system that allows users to give
commands to their smartphone in Gujarati for some basic facilities like calling,
sending SMS, etc. The vocabulary includes a total of 60 words consisting of Gujarati
digits, some persons' names to be considered as contact names, and operational
commands, which yields an overall average recognition accuracy of 82.23%.
|
Jigisha K. Patel, Pritesh N. Patel, Paresh V. Virparia
|
Volume-8 Issue-2 - Object Removal Using Structure-Oriented Image Inpainting
View PDF Abstract
Image inpainting is a technique to restore damaged images and to fill in gaps or
texts written on an image. Image inpainting has different applications, like the
removal of scratches and the restoration of damaged or missing portions of an image,
and in recent years it has become highly popular in image processing. In this paper
we propose an exemplar-based inpainting algorithm together with a proposed
application on Sourceforge.net that is built in Java; using the exemplar-based image
inpainting algorithm we remove unwanted objects and scratches contained in an image.
|
Harshadkumar Patel, Hiten Pethani, Kalpesh Pipaliya, Milankumar Kakadiya, Renish
Usdadia, Hiren Patel
|
Volume-8 Issue-2 - Horizontal and Vertical Projection techniques for Line & word
Segmentation Process in Offline Handwritten Gujarati Text
View PDF Abstract
This research paper describes two important techniques, horizontal and vertical
projection, for the character segmentation step in the Offline Handwritten Gujarati
Text Recognition (OHGTR) process. Segmentation is one of the key steps in text
recognition and one of the factors determining recognition accuracy. This research
paper mainly focuses on the segmentation techniques of horizontal and vertical
projection. The recognition process requires many preprocessing and post-processing
steps to recognize a character. If segmentation of the scanned document is done
properly, the rest of the process becomes easy; otherwise it degrades the final
recognition. It is easier to segment a scanned printed document than a handwritten
one because of the uniform font size and style, whereas in a handwritten text
document the character size varies even when the same document is written by the
same person.
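A minimal sketch of horizontal-projection line segmentation (vertical projection
for words works the same way along the other axis); the ink threshold is an
illustrative assumption:

```python
import numpy as np

def segment_lines(binary_img, min_ink=1):
    """Split a binarized page (text pixels = 1) into line bands: rows whose
    ink count exceeds `min_ink` belong to a text line, empty rows are gaps."""
    profile = binary_img.sum(axis=1)          # horizontal projection profile
    rows = profile > min_ink
    lines, start = [], None
    for y, ink in enumerate(rows):
        if ink and start is None:
            start = y
        elif not ink and start is not None:
            lines.append((start, y))
            start = None
    if start is not None:
        lines.append((start, len(rows)))
    return lines  # word segmentation repeats this per line band on axis=0

page = np.zeros((10, 20), dtype=int)
page[2:4, 3:15] = 1                           # a synthetic "text line"
print(segment_lines(page))                    # -> [(2, 4)]
```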
|
Ashwin R. Dobariya, V. R. Rathod
|
Volume-8 Issue-1 - An Enhanced Approach towards Tourism Recommendation System with
Hybrid Filtering and Association
View PDF Abstract
In a tourism recommendation system, the number of users and items is very large, but
a traditional recommendation system uses only partial information for identifying
similar characteristics of users. Collaborative filtering is the primary approach of
any recommendation system, while content filtering is used to study the behavior of
users; content filtering and collaborative filtering together are referred to as
hybrid filtering. It provides recommendations which are easy to understand, based on
similarities of user opinions such as ratings or likes and dislikes. However, the
recommendations provided by collaborative and content filtering alone cannot be
considered quality recommendations. A recommendation derived after association rule
mining has high support and confidence levels, so it can be considered a strong
recommendation. The hybridization of collaborative filtering with content filtering
and association rule mining can produce strong and quality recommendations even when
sufficient data are not available. This paper combines recommendations for a tourism
application by using a hybridization of traditional collaborative filtering and data
mining techniques.
|
Ms. Monali Gandhi
|
Volume-8 Issue-1 - Quality Grading of Rice Grains Using Image Processing
View PDF Abstract
The purpose of this paper is to find the quality of rice grains by an image
processing technique. Traditionally, grain type and quality were measured through
visual assessment by human inspectors, whose decisions were affected by external
influences such as tiredness, and the process was also time consuming, whereas
machine vision is an automated, cost-effective and non-destructive technique. This
technique begins by acquiring images of rice samples, which are converted to a gray
image and then to a binary image by thresholding. The rice grain area is measured
using the regionprops function of Matlab.
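A minimal Python equivalent of the described Matlab pipeline, assuming scikit-image;
the synthetic input stands in for a scanned rice sample:

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def grain_areas(rgb_image):
    # Color image -> gray -> binary (Otsu threshold) -> per-grain area,
    # mirroring the regionprops pipeline described in the abstract.
    gray = rgb2gray(rgb_image)
    binary = gray > threshold_otsu(gray)   # assumes bright grains, dark ground
    return [r.area for r in regionprops(label(binary))]

img = np.zeros((50, 50, 3))
img[10:14, 5:20] = 1.0                     # one elongated synthetic "grain"
print(grain_areas(img))
```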
|
Ms. Dhara Desai, Mr. Nikunj Gamit
|
Volume-8 Issue-1 - Face Detection on a parallel platform using CUDA technology
View PDF Abstract
Face detection finds applications in various fields in today's world. However, a
single-threaded CPU implementation of face detection consumes a lot of time and,
despite various optimization techniques, performs poorly in real time. With the
advent of General Purpose GPUs (GPGPU) and growing support for parallel programming
languages like CUDA, it has become possible to use the GPU for such computational
tasks. Our design model combines conventional programming on the CPU with GPGPU
programming using NVIDIA CUDA for fast face detection. The face detection algorithm
can be parallelized for implementation on the GPU using CUDA technology. The
Viola-Jones face detector is one such application which, when implemented on the GPU
by exploiting some of the parallel aspects of the algorithm, can give optimized
output. The proposed method includes enhanced Haar-like features and uses a cascade
for training and classification, which can be implemented efficiently and
effectively on the GPU.
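For orientation, a CPU baseline of the Viola-Jones cascade via OpenCV is sketched
below; the paper's contribution is porting this evaluation to CUDA (OpenCV exposes
a GPU variant only in CUDA-enabled builds). The input filename is hypothetical:

```python
import cv2

# Viola-Jones face detection with OpenCV's bundled Haar cascade (CPU baseline).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("photo.jpg")                 # hypothetical input file
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces.jpg", img)
```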
|
Ms. Raksha Patel, Ms. Isha Vajani
|
Volume-8 Issue-1 - Current Trends in Professional Educational Institutions in India
View PDF Abstract
The Indian education system has grown remarkably in terms of student enrollment,
number of universities and institutions, and percentage of educational expenditure
over the last 20 years, and has become the largest education system in the world.
Professional institutions have also increased at an exponential rate. In this paper,
the growth in higher and professional education institutions is described. Various
government institutions monitoring the quality of educational institutions, and the
problems faced by professional education, are also discussed.
|
Dr. (Mrs.) Sona Kanungo
|
Volume-8 Issue-1 - A Comparative Study of Methods for Feature Extraction in Opinion
Mining
View PDF Abstract
Sentiment analysis, or opinion mining, aims at determining what other people think
and how they comment and give their opinions. An opinion can be positive or
negative. Feature extraction in sentiment analysis is now becoming an active area of
research. Features can be implicit or explicit, and different techniques and
algorithms are used for finding them. In this work we compare several such
techniques.
|
Ms. Rudri Mehta, Ms. Anish Shaikh, Ms. Namrata Patel, Ms. Ravija Mehta, Ms. Devangshi
Patel, Ms. Himadri Patel
|
Volume-8 Issue-1 - Mining Search Log for Keyword Stuffing
View PDF Abstract
The core concept behind keyword stuffing is to use a combination of relevant and
irrelevant keywords whose semantics do not match, which ultimately leads to a poor
ranking of the website. Search engine spiders interpret and detect places in a
website where keywords are stuffed, and consider this a negative design element.
Websites should be designed with high-quality, well-written content, with keywords
confined to a limited region of the website. Even though keyword stuffing is
unlikely to lead to search engine penalties, and webmasters can recover if
penalized, it can deter human visitors and reduce the value of the website.
|
Ms. Bhoomika Ahir, Ms. Urvashi Bhalala, Ms. Azmeena Pathan, Ms. Pinal Gangani,
Ms. Suhani Patel, Ms. Poonam Godhwani
|
Volume-8 Issue-1 - Actively Hierarchical Load Balancing Algorithm in Cloud Computing
View PDF Abstract
Cloud computing is emerging as a new paradigm for manipulating, configuring and
accessing large-scale distributed computing applications over the network. Clouds
are highly configured infrastructures delivering platform and software as services,
which helps customers subscribe to their requirements under the pay-as-you-go model.
There are many challenges in cloud computing, such as security, load balancing and
theft or loss of data. As customers are paying for the service, load balancing is
one of the major challenges: the workload must be distributed evenly across all the
nodes. There are many algorithms in the cloud which do load balancing, but they have
limitations such as system instability, low performance and insecurity. The proposed
algorithm enables scalability, avoids over-provisioning and bottlenecks, and
minimizes resource consumption.
|
Mr. Kishan Rakholiya, Mr. Jainam Shah, Mr. Pradip Talaviya, Mr. Chintan Sabhadiya,
Ms. Khushboo Bhagavagar, Mr. Vishal Kumbhar, Mr. Savan Raithatha
|
Volume-6 Issue-1 - Detecting Software Bugs In Source Code Using Data Mining Approach
View PDF Abstract
In a large software system, knowing which files are most likely to be fault-prone
is valuable information for project managers. They can use such information in
prioritizing software testing and allocating resources accordingly. However, our
experience shows that it is difficult to collect and analyze fine-grained test
defects in a large and complex software system. On the other hand, previous research
has shown that companies can safely use cross-company data with nearest neighbor
sampling to predict their defects in case they are unable to collect local data. In
this paper we discuss predicting software bugs in source code by using a data mining
approach, training models on code that is defect-free and code that is defective. In
our experiments we used the ranking method (RM) as well as nearest neighbor sampling
for constructing method-level defect predictors. Our results suggest that, for the
analyzed projects, at least 70% of the defects can be detected by inspecting only
(i) 4% of the code using a Naïve model, or (ii) 6% of the code using the RM
framework.
|
A. Pravin, Dr. S. Srinivasan
|
Volume-6 Issue-1 - Side lobe level reduction of linear array using GA
View PDF Abstract
This paper presents a novel optimization technique using genetic algorithms for
antenna array synthesis. The GA is a global optimization technique, based on the
science of natural evolution, which has proved very effective for optimization and
is capable of solving linear and non-linear problems. In this paper, a genetic
algorithm is used to determine an optimum set of amplitudes of antenna elements that
provide a radiation pattern with maximum side lobe level reduction.
|
V. Shreni, S. G. Kerhalkar, P. Raikwar
|
Volume-6 Issue-1 - Performance Analysis of Space Time Block Coded Spatial Modulation
(STBC_SM) under dual diversity Condition
View PDF Abstract
STBC_SM is a technique which uses the fundamentals of both spatial modulation and
space-time block codes to provide a high spectral efficiency as well as transmit
diversity. In this paper we have evaluated the performance of STBC_SM under varying
antenna conditions in order to study the effect of receive diversity on STBC_SM. The
overall performance is compared under different conventional diversity techniques.
|
S. Dubey, A. Kulshreshtha, V. Shrivastava
|
Volume-6 Issue-1 - Macros Development For Reducing Program Development And Maintenance
Time In Clinical Reporting
View PDF Abstract
The article mainly describes the purpose of the macro facility and the efficiency of
macro-based applications in the clinical domain for creating, testing and providing
resolutions to bugs, defects and other changes to the SAS macro library. It mainly
focuses on the fundamental concepts of program flow: tokenization, the %INCLUDE
statement, the %LET statement, macro triggers, macro statements, macro variables,
global and local symbol tables, automatic macro variables, macro variable
references, substitution within a macro statement, substitution within a SAS
literal, unresolved references, substitution within SAS code, referencing macro
variables, and combining macro variables with text. It also gives a brief idea of
macro functions, defining a macro, macro compilation, calling a macro, macro
storage, macro parameters, the SYMPUT routine, creating a series of macro variables,
creating macro variables in SQL, the need for macro-level programming, conditional
processing, monitoring macro execution, macro syntax errors, parameter validation,
developing macro-based applications, iterative processing, the SYMPUTX routine,
rules for creating and updating variables, rules for resolving variables, and
multiple local tables. This article presents the complete concept of the macro
system.
|
V. Kumar Kunithala, S. Manoj Kumar, R. Bojja, U. Bharam, P. Puvvula, K. Ahamed,
M. Ranga
|
Volume-6 Issue-1 - Software Quality Improvement by Documentation - Knowledge
Management Model
View PDF Abstract
This paper addresses an essential and significant concern in the development of
computer software: its quality. The major proposal of the paper is that computer
science faculty, in their design and implementation of the core curriculum, do not
devote sufficient attention to teaching their students how to develop high-quality
software. As in industry, the most common and popular way of assuring the quality of
programs is through software testing; in other words, quality is treated as a late
addition or as a postscript in software development. The paper presents and
discusses a software quality improvement by documentation - knowledge management
model that can be used to incorporate a wide variety of quality assurance techniques
within a curriculum. The specific focus of this paper is to uncover various types of
errors made by students during software development, which was carried out as a part
of their master's degree programme.
|
V. Chomal, Dr. J. Saini
|
Volume-6 Issue-1 - E-Green Revolution Through Knowledge Management In Agriculture
Sector
View PDF Abstract
The agriculture sector is the backbone of the economic development of India. Several
electronic applications in the field of agriculture help the farmer community get in
touch globally as well as locally with all the knowledge-based information regarding
the agriculture system. Knowledge Management has contributed a great deal to the
agriculture sector, which surely leads towards the e-Green Revolution. In this paper
the authors show such applications and their impact on the agriculture community.
|
Tejas Ghadiyali, Dr. Kalpesh Lad
|
Volume-6 Issue-1 - Internet Marketing: Comparative Analysis of Search Engine Optimization
Applications on various Parameters
View PDF Abstract
There are presently 2 billion users on the internet, approximately 28% of the global
human population. According to Pew Research, people are depending more and more on
Internet search; subsequently, search engines are the main sources of traffic. [1]
More traffic to a commercial website means more visitors, more clients, more deals
and thus greater profit for the business. One ought to consider using SEO (Search
Engine Optimization) to raise website traffic. Search Engine Optimization is the
practice of designing or updating websites with the objective of gaining top
rankings in search engines for a preferred set of keywords applicable to the
website's target audience. [3] SEO is done by SEO specialists who use software tools
to mechanise repetitive tasks; SEO tools are an intrinsic part of performing SEO
work. Therefore opportunities exist to improve SEO tools and to offer an improved
experience to SEO specialists. Here, a comparative study of the features of several
chosen SEO tools helps to reveal opportunities to improve SEO tools and to keep the
SEO specialist abreast of the growing technologies and areas in the field of SEO.
|
Pooja Mistry, Dhaval Mistry, Jikitsha Sheth
|
Volume-6 Issue-2 - Inter vehicle forward collision avoidance in Vehicular Ad-hoc
Networks on highways
View PDF Abstract
A vehicular ad hoc network is a network of vehicles where each vehicle is considered
a node. In inter-vehicle communication, the topology of the network can't be defined
due to the vehicles' mobility, and because of the high-speed mobility there is very
little time for data transfer. This paper describes inter-vehicle collision
avoidance on highways to ensure that a timely warning text is delivered to drivers
regarding accidents. This is achieved by taking two main constraints: the distance
between two vehicles and the speed of the vehicles.
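A minimal sketch of such a warning rule based on the two stated constraints,
distance and speed; the time-to-collision threshold is an illustrative assumption:

```python
def forward_collision_warning(gap_m, v_rear_mps, v_front_mps, ttc_limit_s=3.0):
    """Warn when the rear vehicle closes the gap too quickly.

    gap_m: distance between the two vehicles; speeds in m/s. The 3-second
    time-to-collision threshold is an illustrative choice, not the paper's.
    """
    closing_speed = v_rear_mps - v_front_mps
    if closing_speed <= 0:
        return False                       # gap is constant or growing
    return gap_m / closing_speed < ttc_limit_s

print(forward_collision_warning(gap_m=40, v_rear_mps=30, v_front_mps=15))  # True
```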
|
Ashish K. Desai, Parikshit R. Singh, Bhavik P. Mehta, Manish H. Vala
|
Volume-6 Issue-2 - A study on improving Destination-Sequenced Distance-Vector protocol
View PDF Abstract
An ad-hoc network is a collection of mobile nodes forming an instant network without
fixed topology. In such a network, each node acts as a router and host
simultaneously and can move and join the network freely. DSDV is a modification of
the conventional Bellman-Ford routing algorithm: whenever the topology of the
network changes, a new sequence number is necessary before the network re-arranges.
DSDV adapts the conventional Routing Information Protocol to ad-hoc network routing.
It adds a new attribute, a sequence number, for each routing node; each node
maintains routing information for all known destinations and sets different update
intervals for different nodes, which reduces bandwidth waste, reduces power
consumption and prolongs the life-span of nodes using a sleep mode.
|
Imran M. Garasiya, Zankhana B. Naik, Preeti P. Bhatt
|
Volume-6 Issue-2 - Regression Kriging: Disguised and Unveiled
View PDF Abstract
In geographical information systems (GIS) people work with terrains, which are a
type of surface. There is another type of surface which might not be physically
present but can be visualized like the land surface, known as a statistical surface.
Statistical surfaces include precipitation, snow accumulation, the water table,
population density etc. Spatial prediction of these surfaces is the process of
estimating the values of a target quantity at unvisited locations; when applied to a
whole study area, it is also referred to as spatial interpolation.
Surfaces and their analysis are important to the field of geosciences. Spatially
continuous data (spatially continuous surfaces) play a significant role in planning,
risk assessment and decision making in environmental management. However, these
types of data are usually not readily available and are often difficult and
expensive to acquire. Geostatistics is used to predict the values of a sampled
variable over the whole area of interest, which is referred to as spatial prediction
or spatial interpolation. There are a number of spatial interpolation techniques
with different assumptions, dedicated to predicting or interpolating environmental
variables, of which kriging and its variants are widely used.
The major emphasis of this paper is on regression kriging, a hybrid variant of
kriging that has attractive properties but is not as widely used in geosciences as
might be expected. Therefore, this paper attempts to highlight the status of
geostatistical techniques used in selected GIS and geostatistical software for the
prediction of environmental variables in the field of geosciences.
|
Hiral Patel, Femina Patel, Jignesh Prajapati, Jaishree Tailor
|
Volume-6 Issue-2 - Intrusion Detection System using Neural Network: Comparison and
Algorithm Development
View PDF Abstract
The Internet, mobile technologies and computers have become part of day-to-day life.
As reliance on connectivity for computing and sharing of data is mandatory,
computers, storage devices and mobile devices are connected to the Internet, and IT
installations and confidential data are susceptible to cyber-attacks. To address
these challenging threats, various security tools like anti-viruses, firewalls and
intrusion detection/prevention systems are being deployed. With new attack methods
and fast computing environments, the security mechanism has to be updated as
frequently as possible. False positives will occur if specific rules are enabled to
increase security and reduce false negatives.
|
Rakesh Savant, Pratik Nayak, Sandip Lad, Bhoomika Patel
|
Volume-6 Issue-2 - Test strategies for web based applications
View PDF Abstract
The growth of web-based applications puts emphasis on their quality. Testing has
been the most neglected phase in web application development because of resource and
time scarcity. But the trend is now changing, and more emphasis is placed on
ensuring the reliability of web applications, as their failure may lead to
disastrous situations. This paper discusses the current state of the testing
strategies used to test web-based applications and their limitations. It also
suggests mutation testing as an effective method of test case evaluation for
web-based applications, and finally suggests some mutation operators for the
JavaScript language.
|
Priyanka Gupta, Dr. Nipur, Dr. Rakesh Kumar
|
Volume-6 Issue-2 - Enhancing Service Provider's Profitability and determining the
Frequency of Selection of Web Services for Composition using Dynamic Programming
Approach
using Dynamic Programming Approach
View PDF Abstract
A web service is any service that is available over the internet and uses a
standardized XML messaging system. Web services are language and platform
independent, and they are self-describing, since each publishes a public interface.
Web services should be discoverable: when a web service is created, it should be
published so that interested parties can find the service and locate the public
interface. The World Wide Web, which is human-centric, is now migrating to web
services, which are application-centric. The aggregation of web services to create
virtual enterprises is called web service composition. Web service composition is of
several types: first, web services can be composed statically or dynamically;
secondly, composition can be classified as manual or automatic. For the composition
of web services, the most appropriate service has to be selected. Web service
selection is the process of choosing the most appropriate service to execute a task.
Most web service selection algorithms are client-centric: services that meet the
client's needs are selected. However, a client-centric business model considering
the requirements of the client alone is a failure model. We propose to select
services such that both client and service provider metrics are met. Service
providers, being major stakeholders, should maximize their profit and minimize their
loss. We have developed a stack called SLAKY that incorporates the service
provider's metrics, and we use a dynamic programming approach to find the
appropriate service. Dynamic programming is a multi-stage decision process: the
original problem is broken into sub-problems which are handled efficiently from the
computational viewpoint.
|
P. Sandhya, Dr. M. Lakshmi
|
Volume-6 Issue-2 - An Efficient Algorithm For The Automatic Construction Of AVL
Tree Without Rotations By Sorting Technique
View PDF Abstract
In this paper we propose a new algorithm for constructing an AVL tree with the
elimination of rotations by using a sorting technique. The basic idea of this paper,
which is easy to implement and understand, is to increase performance by eliminating
rotations. This is achieved by sorting the given elements and then applying a
divide-and-conquer technique to split the elements into subsets; binary search trees
are then constructed individually and merged. We show that the effectiveness of AVL
tree performance is achieved without rotations by using the sorting technique.
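A minimal sketch of the rotation-free construction: picking the median of the
sorted input as the root at every level yields a height-balanced BST with no
rebalancing (the paper's separate build-and-merge step is not shown):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def build_balanced(sorted_keys):
    """Divide and conquer: the middle element becomes the root, so the tree
    is height-balanced by construction and no AVL rotations are needed."""
    if not sorted_keys:
        return None
    mid = len(sorted_keys) // 2
    return Node(sorted_keys[mid],
                build_balanced(sorted_keys[:mid]),
                build_balanced(sorted_keys[mid + 1:]))

root = build_balanced(sorted([7, 2, 9, 4, 1, 8, 5]))
print(root.key)  # 5, the median of the sorted input
```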
|
S. Muthusundari, R.M. Suresh
|
Volume-7 Issue-2 - Do returns of stock markets of India and China have long memory?
View PDF Abstract
The paper examines the existence of long memory in the stock markets of India and
China using ARFIMA models. The data set consists of the daily returns of the stock
indices from January 1, 2009 to June 24, 2014. Long memory tests are carried out on
the returns of these series. The results of the ARFIMA model suggest the absence of
long memory in both stock markets. The absence of long memory in asset returns
supports the weak form market efficiency hypothesis.
JEL classification: C22, C50
Keywords: ARFIMA, long memory
|
Dr. Prashant Joshi
|
Volume-7 Issue-2 - Word Sense Disambiguation Using S-WordNet
View PDF Abstract
Word sense disambiguation is the process of identifying the correct sense of a word
in context. Ambiguity of words is a problem in Natural Language Processing. The
purpose of word sense disambiguation is to automatically determine the specific
meaning of a word in a specific context so that a computer can identify it. To
identify the correct sense, the Lesk algorithm is used. This is an overlap-based
approach in which words are overlapped to find the correct sense of a word. In the
Lesk algorithm, the correct meaning of an ambiguous word in a given sentence
sometimes cannot be obtained because Lesk's approach is very sensitive to the exact
words of the definitions, so the absence of a certain word can radically change the
results. This paper introduces a methodology which gives more importance to the
neighborhood words of the word to be disambiguated, as its meaning most probably
depends on the neighborhood words, which the Lesk algorithm does not consider.
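For orientation, the baseline simplified Lesk overlap is sketched below using
NLTK's WordNet (requires the wordnet corpus to be downloaded); the paper's proposed
neighborhood weighting is not shown:

```python
from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

def simplified_lesk(word, sentence):
    """Pick the WordNet sense whose gloss overlaps most with the context.
    Weighting neighbouring words more heavily is the direction the paper
    proposes; this is only the baseline Lesk overlap."""
    context = set(sentence.lower().split())
    best, best_overlap = None, -1
    for sense in wn.synsets(word):
        overlap = len(set(sense.definition().lower().split()) & context)
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(simplified_lesk("bank", "I deposited cash at the bank near the river"))
```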
|
Jay Patel, Nikunj Vagadiya, Dhaval Dobariya, Hiren Gohel, Tushar Donda, Jikitsha
Sheth
|
Volume-7 Issue-2 - Performance Improvement with Hashing to Solve 0-1 Knapsack Problem
using Genetic Algorithm
View PDF Abstract
A genetic algorithm is used to solve the 0-1 knapsack problem (KP). The genetic
algorithm on which this work is based uses a hashing function to store every
feasible solution in a hash table. The 0-1 knapsack problem is an example of a
combinatorial optimization problem, which seeks to maximize the benefit of objects
in a knapsack without exceeding its capacity. A genetic algorithm is a computer
algorithm that searches for good solutions to a problem from among a large number of
possible solutions.
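A minimal sketch of the combination described: a standard GA for 0-1 knapsack whose
fitness evaluations are memoized in a hash table, so repeated solutions cost one
lookup. The instance data, operators and parameters are illustrative:

```python
import random

values  = [60, 100, 120, 80]
weights = [10, 20, 30, 15]
CAP = 50

seen = {}  # hash table of already-evaluated solutions, as the abstract describes

def fitness(bits):
    if bits not in seen:                               # tuples hash cheaply
        w = sum(wi for wi, b in zip(weights, bits) if b)
        v = sum(vi for vi, b in zip(values, bits) if b)
        seen[bits] = v if w <= CAP else 0              # infeasible -> zero
    return seen[bits]

def evolve(pop_size=20, gens=50):
    pop = [tuple(random.randint(0, 1) for _ in values) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(values))     # one-point crossover
            child = list(a[:cut] + b[cut:])
            child[random.randrange(len(values))] ^= 1  # bit-flip mutation
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```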
|
Ashok Mistry, Bhavin Kuchhadiya, Hinal Prajapati, Yamini Prajapati, Jikitsha Sheth
|
Volume-7 Issue-2 - AHP Rank Reversal, The impacts of scale, normalization and aggregation
rules
View PDF Abstract
During recent years, the existence of rank reversal after adding or deleting
alternatives has been the main criticism of AHP. According to the literature, there
are three factors that seem to play a significant role in AHP rank reversal: 1. the
additive function, 2. the measurement scale and 3. the normalization method. While
many studies have tried to examine the effect of each of these factors individually,
there is a lack of a comprehensive approach investigating the effects of all the
factors together. Therefore, in this paper, we consider all of the factors together.
In addition, similarities and differences in model behavior when adding a copy of
the best or the worst alternative have been investigated. The findings of this study
reveal that some methods are better than others; for instance, the logarithmic scale
(for different aggregation rules and normalization methods) shows better performance
than other scales (minimum rank reversal), etc.
|
Mohammad Azadfallah
|
Volume-7 Issue-2 - ICT for Wildlife: A technical review
View PDF Abstract
ICTs are imperative in capturing, processing and distributing information. The
information used and demanded can be considered in light of institutional
requirements from the perspective of the end users, which can improve the
decision-making process. Due to the expansion of ICT technology, "Sensor Networks"
have come up as a new research area, made possible by the miniaturization of
components and the development of low-cost and low-power integrated circuits,
MEMS-based sensors and efficient wireless communication. One can also monitor the
behavior of wildlife in ways that were not possible through traditional means, which
also helps reduce the chances of conflict. This paper portrays some approaches that
we have made for the protection and management of wildlife. Wildlife management
engrosses the application of scientific knowledge and technical skills for the
protection, conservation and management of wildlife and their habitat.
|
Rashmi Pandey, Puja Kadam, Sapan Naik
|
Volume-7 Issue-2 - Applicability of Ordinary Kriging on Educational Datasets
View PDF Abstract
Kriging has been applied in different areas such as geosciences, soil sciences,
fisheries, hydrology, pollution, the health sector and the finance sector. It
assumes that the distance or direction between sample points reflects a spatial
correlation that can be used to explain variation in the surface. This paper
therefore highlights the application of kriging to educational datasets. Results of
GTU (Gujarat Technological University) for the July 2010 examination, covering 5
zones and 40 institutes, were collected. From the available zones and institutes,
the result for an unknown institute was predicted; the predicted results are
significantly close to the actual ones.
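A minimal ordinary-kriging sketch of this kind of prediction, assuming the pykrige
package; the coordinates and pass percentages are illustrative stand-ins for the
GTU data:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumes the pykrige package

# Institute coordinates (e.g. lon/lat) and their result percentages;
# the numbers below are illustrative, not the paper's data.
x = np.array([72.8, 72.9, 73.1, 72.7, 73.0])
y = np.array([21.1, 21.3, 21.2, 21.4, 21.0])
result = np.array([61.0, 55.5, 72.3, 48.9, 66.1])

ok = OrdinaryKriging(x, y, result, variogram_model="spherical")
z_pred, variance = ok.execute("points", np.array([72.95]), np.array([21.25]))
print(float(z_pred[0]))  # predicted result for an unvisited institute
```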
|
Rakesh Rathod, Gaurav Kanet, Nikunj Sardhara, Haresh Solanki, Bhavik Vaghani, Jaishree
Tailor
|
Volume-7 Issue-1 - Efficient mining of High Utility Patterns using Frequent Pattern
Growth Algorithm
View PDF Abstract
Data mining aims at extracting only the useful information from very large
databases. Association Rule Mining (ARM) is a technique that tries to find the
frequent itemsets, or closely associated patterns, among the existing items in a
given database. Traditional methods of frequent itemset mining assume that the data
is centralized and static, which imposes excessive communication overhead when the
data is distributed and wastes computational resources when the data is dynamic. To
overcome this, a Utility Pattern Mining Algorithm is proposed, in which itemsets are
maintained in a tree-based data structure, called the Utility Pattern Tree, which
generates the itemsets without examining the entire database and has minimal
communication overhead when mining distributed and dynamic databases. Hence, it
provides faster execution, that is, reduced time and cost.
|
P. Asha, Dr. T. Jebarajan
|
Volume-7 Issue-1 - Automatic Track Creation and Deletion Framework for Face Tracking
View PDF Abstract
The proposed approach consists of improving track management by the creation and
deletion of tracks when occlusion or failure occurs. In this approach, multi-face
tracking is possible, and track creation and deletion avoid erroneous failures and
improve track management. We improve the accuracy of face detection by using cascade
classifiers, and face tracking is improved by using the Haar cascade algorithm. This
problem, very rarely addressed in the literature, is difficult due to object
detector deficiencies and observation models that are insufficient to describe the
full variability of tracked objects and deliver reliable likelihood (tracking)
information. To achieve this, long-term observations from the image and the tracker
itself are collected and processed in a principled way using a decision tree
algorithm, deciding when to add and remove a target from the tracker. The proposed
algorithm increases performance considerably with respect to state-of-the-art
tracking methods that do not use long-term observations and HMMs.
|
Renimol T G, Anto Kumar R.P
|
Volume-7 Issue-1 - Tools for Efficient Parallelism in High Performance Computing
(HPC)
View PDF Abstract
In today's multi-core era, it has become important for software developers to use
parallelism in their applications. Today's applications incorporate threading to
model parallelism. It is important to understand parallelization problems so that
only the correct parts of a program are carefully selected for parallelism, and
software developers should have knowledge of the resources for implementing
threading in software. In this paper we discuss various modern tools available to
take advantage of parallelism and to remove problems associated with parallelism by
analyzing code.
|
Dr. H. N. Patel, Dr. P. V. Virparia
|
Volume-7 Issue-1 - Intelligent Natural Language Query Processor
View PDF Abstract
Natural Language Processing is an area of research and application concerned with
how computers can be used to understand and manipulate natural language text or
speech to do useful things. The objective of the research is to provide a Natural
Language Query Interface to a non-technical user who is not aware of the sentence
structure of a database query language, and to produce relevant output by generating
the Structured Query Language. The paper presents the framework and the query
analysis used to convert a natural language query into structured query form and
generate an appropriate response in an intelligent way. The prototype system,
Natural-English Language Interface to Database (N-ELIDB), was developed in Java
using NetBeans IDE 6.5, taking the relational database SQLyog Enterprise - MySQL GUI
v5.16 as the backend.
|
Amisha Shingala, Dr. Paresh Virparia
|
Volume-7 Issue-1 - CIElab Based Color Feature Extraction for Maturity Level Grading
of Mango(Mangifera Indica L.)
View PDF Abstract
Mango is considered the king of fruits. Currently, human experts carry out the mango
grading process manually. Here a method for automated grading of mango fruit based
on image processing is proposed, which uses color feature extraction. The proposed
method contains three phases: in the first phase, preprocessing of the captured
image is performed; region property extraction and color feature extraction are
performed in the second phase; finally, histogram analysis is performed and the
mango is graded into three classes, namely Unripe, Partially Ripe and Ripe. Color
feature extraction is performed in the L*a*b* color model using the dominant color
method. The proposed method is not limited to laboratories and can be used in the
real world, as it gives very high accuracy and grades mangoes in real time.
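A minimal sketch of CIELab-based grading: the a* channel separates green from
red/yellow, so its mean is a crude maturity cue. The conversion assumes
scikit-image, and the cut-off values are illustrative, not the paper's calibrated
thresholds:

```python
import numpy as np
from skimage.color import rgb2lab

def maturity_grade(rgb_image):
    """Grade by the mean a* channel in CIELab: strongly negative a* is green
    (unripe), strongly positive a* is red/yellow-shifted (ripe). Cut-offs
    here are illustrative assumptions."""
    a_star = rgb2lab(rgb_image)[:, :, 1].mean()
    if a_star < -10:
        return "Unripe"
    if a_star < 10:
        return "Partially Ripe"
    return "Ripe"

green_mango = np.zeros((4, 4, 3))
green_mango[..., 1] = 0.8                  # a pure-green patch
print(maturity_grade(green_mango))         # -> "Unripe"
```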
|
S. Naik, Dr. B. C. Patel
|
Volume-7 Issue-1 - A Novel Approach for Gujarati Handwritten Text Lines and Words
Segmentation
View PDF Abstract
Text line segmentation is an important step towards any automatic recognition of
offline text documents. Variation in inter-line gaps and skewed or curled text lines
are some of the challenging issues in the segmentation of handwritten text
documents. Moreover, Indian languages like Gujarati have modifier characters like
matras and diacritical marks, which make the segmentation process more difficult.
The selection of an appropriate segmentation strategy is a very important step in
the construction of a character recognition system. This paper describes the line
segmentation and word segmentation processes for Gujarati text documents. We propose
a technique to segment a Gujarati handwritten text document image into text lines
and words based on the smearing technique and the connected component method; the
connected components contain the extracted Gujarati words.
|
J. V. Nasriwala, Dr. B. C. Patel
|
Volume-5 Issue-1 - Invited Article: Teaching and Evaluating the performance in vernacular
(Gujarati) sign Language for the Deaf & Dumb Community
View PDF Abstract
Learning the names of surrounding objects like table, ball, cat etc. is a difficult
task for deaf and dumb children of the 3 to 10 year age group as compared to normal
children. These children have a hearing disability and therefore cannot speak.
Making them understand the things around them under such conditions becomes a
challenge for the family that has such a child. A blind child, though equally
challenged, has only one disability, but the deaf and dumb child has two, which puts
society at large in a challenging position to give them the status to live like a
normal child. To really understand their feelings, one has to be like them for a
moment. Educating the Deaf and Dumb (D&D) child is again more challenging because
the normal method of teaching no longer works. The present work intends to apply
Information Technology (IT) to build and support the teaching and learning process
by incorporating graphics, animations and videos. An evaluation method helps to
ascertain the level of understanding of a child.
|
Dr. Naren S. Burade, Mr. Arvind G. Patel
|
Volume-5 Issue-1 - Optimization Of Recent Attacks Using Internet Protocol
View PDF Abstract
The Internet threat monitoring (ITM) systems have been deployed to detect widespread
attacks on the Internet in recent years. However, the effectiveness of ITM systems
critically depends on the confidentiality of the location of their monitors. If
adversaries learn the monitor locations of an ITM system, they can bypass the monitors
and focus on the uncovered IP address space without being detected. In this paper,
we study a new class of attacks, the invisible LOCalization (iLOC) attack. The iLOC
attack can accurately and invisibly localize monitors of ITM systems. In the iLOC
attack, the attacker launches low-rate port-scan traffic, encoded with a selected
pseudo noise code (PN-code), to targeted networks. While the secret PN-code is invisible
to others, the attacker can accurately determine the existence of monitors in the
targeted networks based on whether the PN-code is embedded in the report data queried
from the data center of the ITM system. We formally analyze the impact of various
parameters on attack effectiveness. We implement the iLOC attack and conduct the
performance evaluation on a real-world ITM system to demonstrate the possibility
of such attacks. We also conduct extensive simulations on the iLOC attack using
real-world traces. Our data show that the iLOC attack can accurately identify monitors
while being invisible to ITM systems. Finally, we present a set of guidelines to
counteract the iLOC attack.
|
A. Rengarajan, C. Jayakumar, R. Sugumar
|
Volume-5 Issue-1 - DWT Based Reversible Watermarking For Lossless Recovery
View PDF Abstract
A novel method for generic visible watermarking with a capability of lossless video
recovery is proposed. The method is based on the use of deterministic one-to-one
compound mappings of frame pixel values for overlaying a variety of visible
watermarks of arbitrary sizes on cover videos. The compound mappings are proved to
be reversible, which allows for lossless recovery of the original video from the
watermarked video. The mappings may be adjusted to yield pixel values close to those
of the desired visible watermarks. Visible watermarks, such as the opaque monochrome
watermark, are embedded as applications of the proposed generic approach. A DWT
(Discrete Wavelet Transform) has been proposed to provide effective robustness for
lossless recovery of the image. Security protection measures based on parameter and
mapping randomization have also been proposed to deter attackers from illicit image
recovery. Experimental results demonstrating the effectiveness of the proposed
approach are also included.
|
V. Uma, K. Vanishree
|
Volume-5 Issue-1 - Nature of M-Learning Affecting Learning Style
View PDF Abstract
Through this paper we focus on how to relate M-Learning to disruptive technology. It
also focuses on the ways in which the drawbacks of the traditional way of learning
can be overcome using M-Learning, and it lays emphasis on linking M-Learning with
the student-centered approach to learning.
Keywords: M-Learning, Disruptive Technology, Student-Centered Learning (SCL)
approach
|
Shubhi Jain, Swati Srivastava, Anupriya Tyagi
|
Volume-5 Issue-1 - Enhanced Data Mining and Decision Tree Techniques for Network
Intrusion Detection System
View PDF Abstract
A Network intrusion detection system (IDS) is a security layer to detect ongoing
intrusive activities in computer networks and the major problem with IDS is that
typically so many alarms are generated as to overwhelm the system operator, many
of these being false alarms. Although smart intrusion and detection strategies are
used to detect any false alarms within the network critical subnets of network infrastructures,
reducing false positives is still a major challenge.
Keywords: Intrusion Alert, False Positive, False Negative, Intrusion Detection
System, Data Mining, Decision Tree Classification, Network Subnets.
|
Nareshkumar D Harale, Dr. B B Mehsram
|
Volume-5 Issue-1 - Business Purpose Multimedia Network
View PDF Abstract
Next Generation Networking (NGN) is a widely used term for a truly converged
packet-based network solution with voice, data and media traffic (along with ISP
connectivity) consolidated on a single, robust platform. The NGN platform delivers
a centralized, fully managed corporate-grade VoIP solution to meet business
challenges in a cost-effective manner with a clear return on investment. NGN
involves key architectural evolutions in telecommunication core and access networks
that will be deployed over the next 5-10 years. The general idea behind NGN is that
one network transports all information and services (voice, data, and all sorts of
media such as video) by encapsulating them into packets, as on the Internet. NGNs
are commonly built around the Internet Protocol, and therefore the term "all-IP" is
also sometimes used to describe the transformation towards NGN.
Keywords: Next Generation Networking (NGN), Telecommunication, VoIP
|
Patel Sumit.
|
Volume-5 Issue-1 - Mobile cum web based voting system
View PDF Abstract
Elections allow the populace to choose their representatives and express their preferences
for how they will be governed. Naturally, the integrity of the election process
is fundamental to the integrity of democracy itself. The election system must be
sufficiently robust to withstand a variety of fraudulent behaviors and must be sufficiently
transparent and comprehensible that voters and candidates can accept the results
of an election. Unsurprisingly, history is littered with examples of elections being
manipulated in order to influence their outcome.
Keywords:Database Systems; Electronic voting; Mobile devices; Web-based
application; Wireless network interfacing; XML data representation
|
Shailee Kumar, Reshma Kapadnis, Gaurav Barokar, Rohit Patil
|
Volume-5 Issue-1 - Role of Enterprise Resources Planning Implementation in Small
and Medium-Sized Enterprises
View PDF Abstract
Information Technology is becoming more and more important for companies,
permeating everything within an organization such as flows, processes, information,
strategic decisions and day-to-day work. Therefore, ERP becomes increasingly
important to save resources and integrate departments. This research paper presents
an overview of ERP and attempts to gain an in-depth understanding of ERP adoption
in SMEs through a survey of the literature. The authors attempt to show the impact
of ERP systems on SMEs, whose limited resources constrain their ability to
implement ERP software successfully.
Keywords: ERP, SME
|
Dr. S.Y. Patil, Belal Saleh Mareai
|
Volume-5 Issue-1 - Network Simulator for Efficient Performance Parameter Testing
& Evaluation
View PDF Abstract
In this paper we present the analysis and performance evaluation of ns2 to
facilitate the simulation of existing and upcoming wired and wireless networks. A
table briefly compares widely used network simulators. We present the framework for
ns2, explain the implementation steps, and show the performance evaluation matrix.
We also outline steps to improve the specified factors affecting network
performance.
Keywords: Network Simulator, Network Tools, Performance Evaluation, Throughput,
Packet Loss, Retransmission, Queue Type and Queue Size.
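A small helper of the kind such an evaluation relies on, assuming the classic ns2
wired trace format (event, time, from, to, type, size, ...); field positions may
differ for other trace variants.

def trace_stats(trace_path, dst_node="1"):
    # ns2 wired trace fields: event time from to pkt_type size flags fid ...
    received = dropped = bytes_rx = 0
    t_last = 0.0
    with open(trace_path) as trace:
        for line in trace:
            parts = line.split()
            if len(parts) < 6:
                continue
            event, t, _src, to_node, _ptype, size = parts[:6]
            if event == "r" and to_node == dst_node:   # packet received at sink
                received += 1
                bytes_rx += int(size)
                t_last = float(t)
            elif event == "d":                          # packet dropped
                dropped += 1
    throughput = 8 * bytes_rx / t_last if t_last else 0.0   # bits/second
    return {"received": received, "dropped": dropped, "throughput_bps": throughput}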
|
Dr. Atul M Gosai, Bhargavi H Goswami
|
Volume-5 Issue-2 - Development of Software Dispatcher Based Load Balancing Algorithms
for Heterogeneous Cluster Based Web Systems
View PDF Abstract
Once a web site becomes popular and the access frequency from a large domain of
users increases, a single web server may not be able to handle the high volume of
incoming traffic. To provide better service to all clients, a fully replicated web
server cluster is needed. In such an environment, one of the most important issues
is that of server selection (and load balancing). In a web cluster, load balancing
is done at Open Systems Interconnection layer 4 (hardware load balancers) and layer
7 (software load balancers) with typical static and dynamic load balancing
approaches. In this paper we present a performance analysis of round-robin, random,
least-connection and a new execution-time load balancing algorithm at layer 7 in a
heterogeneous web cluster. The main purpose of this paper is to help in the design
of new algorithms by studying the behavior of various static and dynamic load
balancing algorithms.
Keywords: Web cluster, Load balancing, Static load balancing, Dynamic load
balancing.
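For illustration, minimal Python versions of two of the compared policies, one
static (round-robin) and one dynamic (least-connection); the server names and
bookkeeping are hypothetical.

import itertools

servers = ["web1", "web2", "web3"]
active = {s: 0 for s in servers}        # open connections per replica
rr = itertools.cycle(servers)

def round_robin():
    # Static: rotate through the replicas regardless of their current load.
    return next(rr)

def least_connection():
    # Dynamic: pick the replica with the fewest active connections.
    return min(active, key=active.get)

def dispatch(policy):
    server = policy()
    active[server] += 1                 # decremented when the request completes
    return server

for _ in range(5):
    print(dispatch(least_connection), dict(active))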
|
Prof. Gautam J. Kamani, Dr. N. N. Jani, Dr. P. V. Virparia
|
Volume-5 Issue-2 - Efficient Load balancing techniques for VoIP applications
View PDF Abstract
VoIP applications require reliable, QoS-guaranteed packet-switching techniques.
Load balancing is an efficient optimization technique to improve QoS: it moves
traffic from congested links to alternative paths in the network. To perform load
balancing and improve link utilization, an efficient load balancing technique for
VoIP applications over Multiprotocol Label Switching (MPLS) networks is proposed.
The proposed MPLSMR algorithm classifies the flows and finds multiple paths for a
source-destination pair. To utilize all the available paths efficiently, the
algorithm first finds an intermediate node k; from node k, multiple paths that
satisfy the given QoS constraints are discovered using a grouping-based multipath
selection algorithm. The incoming flow is split across these paths and the packets
are dispersed in a weighted round-robin fashion.
Keywords: MPLS, VoIP, Load balancing, Multipath routing, QoS, Traffic
split.
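A toy sketch of the weighted round-robin dispersion step (path names and weights
are invented; the MPLSMR path-discovery logic itself is not reproduced).

def weighted_round_robin(packets, paths):
    # Disperse the packets of a flow over multiple QoS-feasible paths in
    # proportion to each path's weight (e.g., its available bandwidth).
    schedule = [path for path, weight in paths for _ in range(weight)]
    assignment = {}
    for i, pkt in enumerate(packets):
        assignment[pkt] = schedule[i % len(schedule)]
    return assignment

# Three paths via an intermediate node k, weighted 3:2:1.
paths = [("src-k-a-dst", 3), ("src-k-b-dst", 2), ("src-k-c-dst", 1)]
print(weighted_round_robin(range(12), paths))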
|
J. Faritha Banu, V.Ramachandran
|
Volume-5 Issue-2 - Analysis and classification of altered fingerprints using FSDCA
View PDF Abstract
In this paper we analyze altered fingerprints and detect them using FSDCA. There
are three types of fingerprint alteration: obliteration, distortion and imitation.
Alteration is a recent problem in pattern recognition: criminals can evade
identification by altering their fingerprints. To overcome this problem we propose
this method.
Keywords: FSDCA, Altered Fingerprint, Obliteration, Distortion,
Imitation
|
Josphineleela. R, M. Ramakrishnan
|
Volume-5 Issue-2 - Multiple Feature Extraction for Foot Print Image
View PDF Abstract
In this paper we introduce a new biometric: a footprint recognition system. The
footprint image has been shown to be distinct for every human being. We first
pre-process the acquired footprint image, and then perform the vital step of
feature extraction. Notably, multiple feature extraction techniques are used: the
features extracted from the foot image are classified and then recognized. Using
multiple feature extraction techniques provides better accuracy.
Keywords: Footprint, Gabor Filter, Wavelet, FNN, SVM
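As an illustration of one of the listed feature extractors, a small Gabor-filter
bank over a stand-in footprint image using OpenCV; the kernel parameters are
arbitrary, and the resulting vector could feed the SVM/FNN classifiers named above.

import cv2
import numpy as np

def gabor_features(img, orientations=4):
    # Filter the footprint image with a small Gabor bank and keep the
    # mean/std of each response as a compact texture feature vector.
    feats = []
    for i in range(orientations):
        theta = i * np.pi / orientations
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        resp = cv2.filter2D(img, cv2.CV_32F, kernel)
        feats.extend([resp.mean(), resp.std()])
    return np.array(feats)

footprint = np.random.randint(0, 255, (128, 64), dtype=np.uint8)  # stand-in image
print(gabor_features(footprint))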
|
V. D. Ambeth Kumar, Dr. M. Ramakrishnan
|
Volume-5 Issue-2 - Business Purpose Multimedia Network
View PDF Abstract
Next Generation Networking (NGN) is a widely used term for a truly converged
packet-based network solution with voice, data and media traffic (along with ISP
connectivity) consolidated on a single, robust platform. The NGN platform delivers
a centralized, fully managed corporate-grade VoIP solution to meet business
challenges in a cost-effective manner with a clear return on investment. NGN
involves key architectural evolutions in telecommunication core and access networks
that will be deployed over the next 5-10 years. The general idea behind NGN is that
one network transports all information and services (voice, data, and all sorts of
media such as video) by encapsulating them into packets, as on the Internet. NGNs
are commonly built around the Internet Protocol, and therefore the term "all-IP" is
also sometimes used to describe the transformation towards NGN.
Keywords: Next Generation Networking (NGN), Telecommunication, VoIP
|
Patel Sumit, Rathod Paresh, Jain Prince
|
Volume-5 Issue-2 - Cognitive Ontology Enrichment For Semantic Information Retrieval
View PDF Abstract
Information is knowledge; knowledge is wealth. The combination of a concept with
its associative relation, in a quantitative sense, is said to be valid information.
Relevant information retrieval through a cognitive process is the main objective of
this paper. The organized specification of concepts and their relations is said to
be an ontology. Here we use a cognition-based ontology for information retrieval,
which can be implemented in a semantics-based information retrieval system.
Keywords: Ontology, NLP, Syntactic Analysis, Semantic Analysis, OWL, Machine
Learning
|
G. Nagarajan, K.K.Thyagharajan
|
Volume-4 Issue-2 - Invited Article: Productivity Improvement Tool: Code Generator
View PDF Abstract
Code Generator is a Windows-based tool for generating code for .NET applications
that follow an MVC-style architecture, i.e., applications having a presentation
layer, a business layer and a data layer, where the data layer calls stored
procedures in SQL Server.
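For flavour, a tiny template-based generator in Python emitting a hypothetical
data-layer method that wraps a stored-procedure call; the class, entity and
procedure names are invented, not the tool's actual output.

from string import Template

# Hypothetical template for a .NET-style data-layer method; the generator
# fills in the entity-specific names.
DATA_LAYER = Template("""\
public DataTable Get$Entity()
{
    using (var cmd = new SqlCommand("usp_Get$Entity", _connection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        var table = new DataTable();
        new SqlDataAdapter(cmd).Fill(table);
        return table;
    }
}""")

print(DATA_LAYER.substitute(Entity="Customer"))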
|
Mr. Nikunj Patel
|
Volume-4 Issue-2 - Expert System Implemented for Problem Solving in Commerce/Business
Domain
View PDF Abstract
The application of Expert Systems in the commerce/business domain is a relatively
new research area. The rapid growth of artificial intelligence has led to the
development and implementation of expert systems for the purpose of
commerce/business problem solving. This has generated a need for the analysis and
review of Expert Systems in the commerce/business domain. Here a few expert systems
already implemented in the area of resource allocation/space utilization are
reviewed.
Keywords: Expert systems, Explanation facility, Inference engine, Knowledge update
facility, Resource allocation.
|
Viral Nagori, Dr. Bhushan Trivedi
|
Volume-4 Issue-2 - Parallelism - Trends to be Studied and Misconceptions to be Eliminated
for Future High Performance System
View PDF Abstract
There are two approaches to the problem: a hardware approach and a software
approach. The hardware approach toward multicore processors is influenced largely
by the technology limitations of wire delays, power efficiency and the difficulty
of extracting instruction-level parallelism. The evolution of the software approach
has led us toward cloud computing. Today is the time when we have to think of
multicore processors within the emerging, evolutionary concept of the cloud. In
this paper we raise the misconceptions that software is driven by hardware, that
multicore processors everywhere mean performance, and that multicore means
parallelism and hence optimum speed. The trends which are the driving force behind
these misconceptions are addressed and a justifiable view of future systems is
presented.
Keywords: Instruction Level Parallelism, Cloud computing, Hardware and Software
trends.
|
D. H. Ahir, Dr. Nikesh Shah
|
Volume-4 Issue-2 - Comparative Study of Web Search Engines and User-Centric Search
Engine
View PDF Abstract
Searches of the entire World Wide Web using search engines such as Google, Yahoo!,
Bing, and Ask have become an extremely common way of locating information. Search
engines provide great facilities for Internet users to retrieve the intended
information from hundreds of millions of Web pages within a fraction of a second.
Today's search engines have become more powerful and efficient by using various
algorithms and technologies to provide the best results demanded by users, and they
now provide various added services too. One more area where search engines could
improve is keeping track of user activity and the history of visited sites, which
would help users carry their previously visited sites across different Web
browsers; nowadays this is done by the Web browser only. This paper presents a
critical comparison of various popular search engines based on added features. A
detailed analysis is presented and results are provided.
Keywords: Search Engine Evaluation, Search Engine Statistics, Feature Comparison,
World Wide Web.
|
Mubashshirahbanu Shekh, Vikramsinh Sisodiya, Ms. Jikitsha Sheth, Dr. Kalpesh Lad
|
Volume-4 Issue-2 - QFD and Data Mining: Analysis and Incorporation
View PDF Abstract
In today's fast-paced business environment, with floods of data available,
decision-making has become a complex task. These data contain nuggets of valuable
information in hidden form, which are often not effectively utilized due to a lack
of suitable analytic tools and techniques. Data Mining, a buzzword of the present
era, is the non-trivial process of identifying valid, novel, potentially useful and
ultimately understandable patterns in data. With the advent of technologies like
Data Mining, the data can now be suitably analyzed and mined to yield valuable
outcomes. Quality Function Deployment (QFD) is an extensive customer-oriented
product development process that strives to improve quality and gain higher
customer satisfaction.
Keywords: QFD, Data Mining, Data, Product Quality, Voice of customer
|
Ashish .K. Sharma, Dr. Jitendra .R. Sharma, Sangita A. Sharma, Pankaj S. Agrawal
|
Volume-4 Issue-2 - Problems and Challenges in Wireless Network Intrusion Detection
View PDF Abstract
Wireless ad hoc sensor networks are becoming popular in civil and military
applications. But security is one of the significant challenges for sensor networks
because of their deployment in open and unprotected environments. As cryptographic
mechanisms are not enough to protect a sensor network from external attacks, an
intrusion detection system needs to be introduced. Though intrusion prevention is
one of the major and efficient defenses against attacks, there might be attacks for
which no prevention method is known. Besides protecting the system from known
attacks, an intrusion detection system gathers information about attack techniques
and helps in the development of intrusion prevention systems.
Keywords: WSN, IDS, Hierarchical Design, Security, Sensor Node, Cluster Node,
Regional Node, Base Station
|
Ms. Ami Desai, Ms. Hiral Prajapati, Mr. Dharmendra Bhatti
|
Volume-4 Issue-2 - Trust Evaluation Model for Mobile Ad Hoc Network
View PDF Abstract
Securing a mobile ad hoc network is a challenging task due to the lack of trust
among nodes. In this paper we analyze various trust models for mobile ad hoc
networks and then propose a trust evaluation model that estimates the trust level
of supplicant nodes by evaluating and analyzing node behavior. The model is also
designed to detect compromised nodes inside the environment and to isolate
compromised and misbehaving nodes from the network. The model was tested by
introducing misbehaving and compromised nodes into the environment, and the results
show that it is highly effective in detecting them compared with various other
security models for mobile ad hoc networks.
Keywords: MANET, Trust, Node Misbehavior, Compromised Node.
|
M.B. Mukesh Krishnan, Prof. Dr. P. Sheik Abdul Khader
|
Volume-4 Issue-2 - Inclusive Analysis of Mobile OS Features, Capabilities, Performance
View PDF Abstract
Mobile phones have made a bigger difference to the lives of people, more quickly,
than any previous communications technology. They have spread the fastest and proved
the easiest and cheapest to adopt. It is estimated that more than 5 billion people
currently have mobile phones and more than 6 billion will have them in 2013. Mobile
phones have already started functioning as more than just communications devices.
Mobiles serve various functionalities like watch, alarm, calculator, calendar, reminder,
games, etc. Smartphones are fast becoming a feasible alternative to PDAs and
laptops, offering phone features such as voice and SMS coupled with mobile internet
applications, multimedia functionality, high-speed data processing capabilities,
and inbuilt GPS capabilities.
Keywords:Mobile Operating System, Android, Windows OS, i-Phone
OS, Feature of OS, Security in OS.
|
Mr. Hiren H. Patel, Ms. Kamini P. Patel, Mr. Jitendra Nasriwala, Mr. Jitendra Upadhyay
|
Volume-4 Issue-2 - Agile Software Development Methodology and its Cost Estimation
Technique
View PDF Abstract
This study bridges the gap between the software industry and academia through the
incorporation of several techniques into one methodology. The main objective of
this article is three-fold: 1) to focus on the development practices used by small
software firms; 2) to study the documentation practices in small software firms and
suggest one type of documentation practice that will benefit current practice; and
3) to find a solution to the problem faced by small firms when materializing the
requirement storyboard drawn by the client into a cost estimate. Together with
these, a simple and easy-to-use empirical formula for evaluating the cost of Agile
development of medium-size software has been proposed from past development data.
Keywords:Agile, SDLC, Software Development, Extreme Programming,
SCRUM
|
Dr. Utpal Roy, Susmita Das
|
Volume-4 Issue-1 - Sequential Test for the Parameter of Generalized Maxwell Distribution
View PDF Abstract
A sequential probability ratio test is developed for testing the hypothesis regarding
the parameter of a Generalized Maxwell distribution. The expressions for the operating
characteristics (OC) and average sample number (ASN) functions are derived. For
the purpose of plotting the OC and ASN functions different approaches are used.
Keywords:Generalized Maxwell distribution, SPRT, OC and ASN functions,
Newton-Raphson method.
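A generic sketch of Wald's SPRT decision loop; the exponential densities below
merely stand in for the Generalized Maxwell likelihoods derived in the paper.

import math
import random

def sprt(samples, logpdf0, logpdf1, alpha=0.05, beta=0.05):
    # Wald's SPRT: accumulate the log-likelihood ratio and stop as soon as
    # it crosses log(B) (accept H0) or log(A) (accept H1).
    log_A = math.log((1 - beta) / alpha)
    log_B = math.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += logpdf1(x) - logpdf0(x)
        if llr >= log_A:
            return "accept H1", n
        if llr <= log_B:
            return "accept H0", n
    return "no decision yet", len(samples)

# Stand-in densities: H0 is exponential with rate 1, H1 with rate 2.
lp0 = lambda x: -1.0 * x                  # log f0(x) = log(1) - 1*x
lp1 = lambda x: math.log(2.0) - 2.0 * x   # log f1(x) = log(2) - 2*x
data = [random.expovariate(2.0) for _ in range(200)]
print(sprt(data, lp0, lp1))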
|
Surinder Kumar, Naresh Chandra
|
Volume-4 Issue-1 - Analytical Survey of Localisation Strategies in Wireless Sensor
Networks and Their Applicability in WSN Applications
View PDF Abstract
The use of state-of-the-art wireless sensor networks in various day-to-day problems
requires the spatio-temporal distribution of sensed data to get a clear view of the
data and its patterns. As a large number of sensor nodes are scattered over a
certain area, locating their positions poses a challenge. This paper discusses
various ways to find inter-node distances by measuring communication parameters
like RSS, AoA, ToA, etc. These parametric measurements finally lead to a plot of
the network. While dealing with the topological challenges, the limitations of the
deterministic and probabilistic methods are discussed here. Finally, the
suitability of various techniques for different WSN applications is presented.
Keywords: wireless sensor networks, localization, spatio-temporal
distribution.
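As a worked example of one listed parameter, inverting the standard log-distance
path-loss model to turn RSS into a distance estimate; the reference power and
path-loss exponent below are assumed values, not figures from the paper.

def rss_to_distance(rss_dbm, p0_dbm=-40.0, d0=1.0, n=2.7):
    # Invert the log-distance path-loss model
    #   RSS(d) = P0 - 10 * n * log10(d / d0)
    # to estimate the inter-node distance d from a received signal strength.
    return d0 * 10 ** ((p0_dbm - rss_dbm) / (10 * n))

for rss in (-40, -60, -75):
    print(f"RSS {rss} dBm  ->  ~{rss_to_distance(rss):.1f} m")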
|
Kamlendukumar Pandey , Dr. S. V. Patel
|
Volume-4 Issue-1 - E-Business Conflict Resolution: The Role of XBRL, The Next-Generation
Digital Language of Business Reporting
View PDF Abstract
Conflict Resolution (an intervention aimed at alleviating or eliminating electronic
discord through conciliation) as a business theme is more important than ever in
today's fast-paced world of e-business and globalization. The significance of
efficient financial reporting and the Internet in human lives cannot be overstated.
Investors and users of the Internet need accurate and reliable financial information
that can be delivered promptly to help them make informed financial decisions. The
Internet embodies the importance of technology, its role in resolving disputes,
and its impact on the increasing globalization of business, information, and culture.
XBRL means Extensible Business Reporting Language. It is an open specification which
uses Extensible Markup Language (XML) based data tags to describe financial statements
for both public and private companies. It uses accepted financial reporting standards
and practices to exchange financial statements across all software and technologies,
including the Internet. It is a twenty-first century digital business reporting
language which allows software vendors, programmers, intermediaries in the information
preparation and distribution process and end users who adopt it as a specification
to enhance the creation, exchange, and comparison of business reporting information.
This paper, using the secondary data methodology approach suitable for large expository
research, presents the advent of the XBRL technology and framework, its history
and why it is essential. It reveals the relationship between XBRL and emerging e-standards,
highlights the XBRL as a vital support for e-business, and showcases the largest
XBRL e-government project in the world. This research finds that XBRL streamlines
the financial information supply chain that includes public and private companies,
the accounting profession, data aggregators, the investment community and all other
users of financial statements. Findings also show that XBRL offers several key benefits
like technology independence, full interoperability, efficient preparation of financial
statements and reliable extraction of financial information. This work recommends
that organizations/investors in every industry, regulatory bodies, professional
associations, government, and educational institutions must embrace the XBRL in
order not to suffer obsolescence and uncompetitiveness.
Keywords:Conflict Resolution, XBRL, data aggregators, secondary
data methodology, XML, distribution process.
|
Faboyede Olusola Samuel
|
Volume-4 Issue-1 - An Effective Approach Using BI Techniques to Analyze Process
Control Parameters During Steel Products Manufacturing
View PDF Abstract
Analysis is a process of understanding data more precisely. The process of analysis
differs according to the data we are trying to analyze. Typical steel product manufacturing
involves many processes running 24x7 in sequence, comprising different process
units. These process units have sensors generating useful data for process control.
These parameters have to be analyzed and monitored in real time to get a good quality
product.
The conventional methods to analyze these data are inefficient. Hence, in this paper
we present an approach for analysis that combines various process control and quality
control parameters, uses online analytical processing (OLAP) system and provides
flexibility to the user for necessary analysis.
Keywords:Data Warehouse, Online Analytical Processing, Statistical
Process Control, Statistical Quality Control, Cube.
|
Veena N. Jokhakar , Dr. S. V. Patel
|
Volume-4 Issue-1 - Neuro-Fuzzy Advisory System for Banks with Type 2 Fuzzy Approach
View PDF Abstract
India is the second most populous country in the world. It is still a developing
country progressing towards financial stability, and its various banks abide by the
rules of the Reserve Bank of India. To strengthen India economically, banks issue
loans to people, institutes, industries and countries. Hence there is a need for an
effective guidance system for banks that advises bank authorities on how to grant
loans, which loan plans are possible, and what maximum revenue can be earned from a
business loan. The paper presents an effective advisory system for banks to take
such crucial decisions. The proposed system is developed with artificial neural
networks and type 2 fuzzy logic. The structure of the artificial neural network,
the training data, the fuzzy membership functions used and the implementation
details are also discussed in this paper.
Keywords:Artificial Neural Networks, Type 2 Fuzzy Logic, Neuro-Fuzzy
Systems, Bank Loan.
|
Mr. Jeegar Ashokkumar Trivedi , Dr. Priti Srinivas Sajja
|
Volume-4 Issue-1 - AUREXGEN- A Novel Algorithm to Find Similar Data and Improve
Database Query Response Time
View PDF Abstract
Data retrieval is mainly concerned with exact field value matching, but a relevant
value may be slightly different from the exact value; in such cases, we may lose
important data. The effectiveness of a retrieval system strongly depends on the
results it retrieves. Users who have knowledge of regular expressions can pose
queries that way, but users who operate applications are generally not aware of
such tricks; e.g., in a student management system, the operator is usually clerical
staff. In such situations data retrieval may fail even if the data exists in the
system. In this paper, this point is taken into consideration and an algorithm is
developed to process and search structured data in order to get all the relevant
information. We deal with real-world entity names like Employees, Students,
Products, etc., rather than considering their meanings and synonyms.
Keywords:Misspelled Queries, Information retrieval, Matching Similar
Data, Similarity Algorithms.
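For illustration, a classic similarity measure of the kind the paper builds on:
Levenshtein edit distance normalized into [0, 1] so that near-miss values still
match. This is a textbook routine, not the AUREXGEN algorithm itself.

def levenshtein(a, b):
    # Classic edit distance via dynamic programming (two rolling rows).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    # Normalize distance into [0, 1] so near-misses can still match.
    m = max(len(a), len(b)) or 1
    return 1 - levenshtein(a, b) / m

print(similarity("Patel", "Pattel"))   # a misspelled surname still scores ~0.83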
|
Payal Pandya , Dhaval Joshi , Dr. S. V. Patel
|
Volume-4 Issue-1 - The Cellular Network - An Emerging Resource for City Planning
View PDF Abstract
Telecommunication companies generate a tremendous amount of data. These data include
call detail data, which describes the calls that traverse the telecommunication
networks, network data, which describes the state of the hardware and software components
in the network, and customer data, which describes the telecommunication customers.
There are several applications where telecommunication data mining can be used to
uncover useful information buried within these data sets. The primary applications
are, to identify telecommunication fraud, to improve marketing effectiveness, and
to identify network faults. Newly emerging telecom data mining applications are
industry-specific, such as bank marketing systems, city planning systems, census
data management systems, etc. In this paper we present a Telecommunication Data
warehouse Model to discover knowledge regarding mobility patterns and traffic density
in the cellular network for the prediction of road demand that can be helpful to
the city planners to make effective decisions using cellular network usage data.
Keywords:Data Warehouse, Road Traffic Management, Telecommunications.
|
Ms. Roohana Parabia , Dr. Paresh Virparia , Dr. Sanjay Buch
|
Volume-4 Issue-1 - Defining A Database Upgrade Design Methodology
View PDF Abstract
Databases are the key component in any product or enterprise system. As multiple
versions of these products and systems have been rolled out over the years, a need
has been felt to define a database upgrade methodology to overcome the multiple
issues that ISVs (Independent Software Vendors) face with ad-hoc upgrade strategies.
This paper attempts to summarize the drawbacks of the conventional upgrade approach;
recommend the key components of a database upgrade and define their boundaries,
roles and responsibilities. Finally, the paper also presents recommendations for
changes to the conventional SDLC (Software Development Life Cycle).
Keywords:databases, upgrade, methodology, transformation, product,
software development, SDLC, ISV
|
Nakul Vachhrajani
|
Volume-4 Issue-1 - Challenges in Genetic Algorithm Based Intrusion Detection
View PDF Abstract
Intrusion detection is the technique of detecting malicious traffic on a network
or a device. It is one of the critical network security components against emerging
intrusion techniques and attacks. In this paper we present a survey of different
intrusion detection approaches. Intrusion detection systems based on Genetic
Algorithms are currently attracting researchers due to their inherent potential.
Intrusion detection faces various challenges, such as reliably detecting malicious
activity and performing efficiently to cope with the large amount of network
traffic. Here we analyze the present research challenges and issues in Genetic
Algorithm based intrusion detection. Finally we carry out experiments based on our
sample Genetic Algorithm using the KDD Cup 99 data set. The main contribution of
the implementation is the understanding of challenges in Genetic Algorithm based
intrusion detection.
Keywords:Security, Challenges, Genetic Algorithm, Intrusion Detection.
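A toy Genetic Algorithm in the spirit described, evolving a two-threshold detection
rule over synthetic connection records; the representation, operators and data are
all invented for illustration.

import random

rng = random.Random(1)

# Synthetic labeled connections: (duration, bytes); True = attack when both high.
DATA = []
for _ in range(200):
    d, b = rng.randint(0, 9), rng.randint(0, 9)
    DATA.append((d, b, d + b > 10))

def fitness(rule):
    # Score a (duration_threshold, bytes_threshold) rule: +1 for each record
    # it classifies correctly, -1 otherwise.
    d_t, b_t = rule
    return sum(1 if ((d > d_t and b > b_t) == attack) else -1
               for d, b, attack in DATA)

def evolve(pop_size=30, generations=40):
    pop = [(rng.randint(0, 9), rng.randint(0, 9)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])                    # one-point crossover
            if rng.random() < 0.1:                  # mutation
                child = (rng.randint(0, 9), child[1])
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

print(evolve())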
|
Dharmendra G. Bhatti , Dr. P. V. Virparia , Dr. Bankim Patel
|
Volume-4 Issue-1 - Image Segmentation of Handwritten Dates on Bank Cheques
View PDF Abstract
In this paper the author briefly describes different issues of the handwritten date
segmentation task along with its techniques. The paper describes a system developed
to segment handwritten date information written on bank cheques. The system uses a
newly adopted segmentation-based strategy. In order to achieve high performance in
terms of efficiency and reliability, a knowledge-based module is proposed to
segment the date. The interaction between the segmentation and recognition stages
is properly established by using segmented-date generation and evaluation modules.
The paper concludes with the current status of the effort made in segmentation of
handwritten dates and future enhancements in the same direction.
Keywords:OCR, Segmentation, Pattern-based grammar.
|
Dr. Manish M. Kayasth
|
Volume-1 Issue-1 - Profiling the adopters and non-adopters of e-purchasing: A genetic
algorithmbased data mining approach
View PDF Abstract
Data mining has been used in e-commerce for some time already. It has many
applications in this field, such as searching for patterns in transactional data,
preparing personalization applications, etc. With the proliferation of electronic
commerce, e-purchasing has become a daily practice for many purchasing
organizations. To embrace e-purchasing successfully, these organizations should
identify determinants that are crucial for its successful adoption. In an effort to
identify such determinants, including organizational readiness, information
technology infrastructure, and user characteristics, we propose a novel genetic
algorithm-based data mining technique. The application of the proposed data mining
technique to empirical data collected through a mail survey proves to be useful for
extracting hidden but valuable insights into the successful implementation of
e-purchasing.
Keywords: DM, E-Purchasing, E-Commerce, GA
|
Dr. R.N. Satpathy, Dr. J. P. Panda
|
Volume-1 Issue-1 - Grid implementation using service oriented architecture mining
approach
View PDF Abstract
With the advent of internetworking and the evolution of web technologies, business
needs, especially isolated service requests/components, put a new demand on
software architecture. Service-Oriented Architecture (SOA), as the next generation
software architecture, through the utilization of web services, XML and other
related technologies, provides a viable working solution to implement dynamic
e-business.
Keywords: SOA, grid, service, loose coupling, web services, hot spot leveling
|
A. Raghu Nath, D. Radha Rani, B. Sai Kalyan, R. Suresh
|
Volume-1 Issue-1 - A simulation model to predict the progression of diabetics using
non-pathological data
View PDF Abstract
This paper describes a simulation model to predict the progression of diabetes
using non-pathological data. Simulation modeling is part of bioinformatics, as
bioinformatics is a management information system for molecular biology,
computational biology, population modeling and numerical simulations, and has many
practical applications. Here we have used a simulation model to diagnose the
disease and disease risk from the symptoms of an organism and some other factors.
Keywords: Simulation and modeling, diabetes, non-pathological data.
|
Dr. P. V. Virparia, Dr. Hetalkumar Panchal
|
Volume-1 Issue-1 - Consumers' perception about maybelline's product: A case study
of surat city using factor analysis
View PDF Abstract
In this study an attempt has been made to determine the factors which Maybelline
customers keep in mind when they buy their products. We randomly selected 30 respondents
from various parts of Surat city and they were asked to rank ten statements regarding
Maybelline products on a 7-point scale. On the basis of data thus collected and
applying Factor Analysis we arrived at the following conclusions:
Three factors were extracted from the ten variables using Factor Analysis (a data
reduction technique) in SPSS 11. These three factors, viz. 'Eye Catching Effect',
'Effective Fragrance', and 'Campaigning with Care by Maybelline' or 'Maybelline's
Publicity Policy', together explained about 73 percent of the variation in
consumers' perception of Maybelline's products in Surat city.
Keywords: Factor Analysis, KMO and Bartlett's Test, Scree plot, Principal
Components
|
M. B. Dave, Gaurang Rami, Trupti Goyani
|
Volume-1 Issue-1 - Forecasting of stock using hybrid system of neural network and
genetic algorithm
View PDF Abstract
Genetic Algorithms and Artificial Neural Networks are soft computing methods
inspired and motivated by biological computational processes. Neural networks are
highly simplified models of the human nervous system which show the ability to
adapt to circumstances and learn from past experience. Genetic algorithms are
adaptive search and optimization algorithms inspired by the process of biological
evolution. As stock market prediction is an important issue in finance, Artificial
Neural Networks have been used in stock market prediction for the last decade. The
hybridization of a Neural Network with a Genetic Algorithm is performed for the
purpose of investigating better methods of problem solving: the Genetic Algorithm
optimizes the Neural Network. Here the prediction of stock behavior is done using
this hybrid approach.
Keywords: GA-Genetic Algorithm; NN-Neural Network; BP-Back Propagation;
ANN-Artificial Neural Network; EMH-Efficient Market Hypothesis; BPN-Back
Propagation Network.
|
Nirupama Parmar, Falguni Ranadive
|
Volume-1 Issue-1 - E-Business and information superhighway - The indian perspective
View PDF Abstract
The Indian market is flooded with Information Technology jargon. E-Business is the
buzzword which every marketer and aware customer is chanting. However, if we dig
into reality a little further, this proliferation of information technology seems
to be restricted to a few privileged parts of the country and still fewer sections
of the public/private sectors of the Indian market. The real challenge of using
Information Technology for our benefit lies in the fact that ordinary players in
the market are yet to be educated on the functionalities of technology and the
hurdles lying in the path of making technology work for them.
This paper describes the challenges of e-Business in the context of India. In a
country with a population of more than one billion, e-Business still has a lot of
scope to grow. The paper examines why India is lagging behind in e-Business when it
has information technology resources, a vast number of well-established business
organizations and a foundation for all sorts of business opportunities.
Keywords: e-Business, challenges, benefits, and information superhighway
|
Praveen Kumar, Richa Anand
|
Volume-1 Issue-1 - Parallel min-max: An approach for parallelizing game tree search
View PDF Abstract
Game-playing systems have search engines at the core of the application. Although
single-processor machines are becoming faster every year, researchers are always
looking for increased speed, since that will improve the quality of their
game-playing systems. Faster machines provide deeper searches and a better quality
of play, which can be taken one step further by using a large number of machines in
parallel. Parallel game-playing systems use multiple processors to cooperate in
computing the game tree. Games like chess, checkers and go need exponential
computations; to get better results in selecting moves, deeper search depths must
be considered. As the depth of the game tree increases, the number of computations
increases, and a single-processor system is not feasible for getting better
results. In such cases we need multiple processors for evaluating the game tree:
the whole game tree is distributed across multiple processors for better results.
Parallel MIN-MAX is a game tree search algorithm distributed over multiple
processors.
Keywords: Game playing, Parallel MIN-MAX, game tree search algorithm
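A minimal sequential MIN-MAX over a nested-list game tree; in the parallel variant
each root subtree (each inner list below) would be handed to a separate processor.

def minimax(node, maximizing=True):
    # Evaluate a game tree given as nested lists, where leaves are static
    # evaluation scores and levels alternate between max and min players.
    if not isinstance(node, list):          # leaf: static evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Depth-2 toy tree: each sublist is one root move's subtree, and each
# sublist could be searched independently on its own processor.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree))   # 3: the max over the opponent's minimizing replies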
|
Neelam Surti
|
Volume-2 Issue-1 - The Application of Total Quality Management (TQM) in Academics
View PDF Abstract
This paper examines the issue of quality of education in India. The paper recognizes
quality of education as one of the most widely used and spoken of concept in India,
although, very little or no unanimity with regard to its meaning has been reached.
This is especially true in higher education as compared to industry, where clearly
definable products with quantifiable qualities exist. The 'product' of higher
education is intangible and the customer very difficult to identify.
there is mounting concern about the state of education at all levels. This is rooted
in the realization that literacy levels and academic achievement will determine
individual's job attainment and earning as well as the general economic well being
of the society. Moreover, the quality of life in the society will be affected by
the level and quality of social skills acquired in educational institutes. It is
in line with this that the paper looks at the various issues bordering on quality
of higher education in India and suggests different options and strategies which
can be used to improve the quality of education, especially, at the university level.
Thus, the following questions will be answered in the paper: How is quality perceived?
Can it be measured? What is the interrelationship between internal and external
assessment procedures? How can educational institutes deal with quality improvement
and development within the context of Total Quality Management (TQM)?
Quality has become an integral feature of education all over the world. We
are continually faced with quality initiatives and controls, not only from central
governments but also from awarding bodies. Often, these quality initiatives are
also incorporated into appraisal schemes and in Total Quality Management (TQM).
This study examines literature relating to TQM, quality assurance, and quality enhancement
and also considers the impact an interpretive approach to quality would have on
the pedagogic practice of tertiary institutions. The role of university management
is to provide the vision and the making of that vision a reality through encouragement
and active participation in quality oriented exercise. This paper will provide a
framework toward achieving this.
Keywords: Total Quality Management, Academia, PDCA Approach
|
Dr. Mehul S. Raval
|
Volume-2 Issue-1 - Software Development Patterns of students: an experience
View PDF Abstract
Agile methodologies are emerging and gaining popularity in industry. To help our
students grow into efficient software developers, in addition to good analysts and
managers, we must instill into them the best methodologies for software development.
Pair programming is a methodology in which two people work together and periodically
switch between the roles of driver and navigator. Instead of partitioning a task
into a number of activities, where each member performs a different activity alone,
in pair work both partners perform each activity together. This paper presents the
results of a study to assess the pattern (which we claim is similar to agile type
methodologies) in software development procedures of student groups in a university
setting. It is seen that vision and talent, incorporated with structure and tools,
can serve as a good process model. That is to say, a balance of agile and
process-driven approaches reportedly achieves the best performance.
Keywords: Agile method, Empirical approach, Process models, Software
development.
|
Vandana Bhattacherjee, Madhumita S. Neogi, Rupa Mahanti
|
Volume-2 Issue-1 - Developing Mobile Based System with Qtopia SDK in Virtual Operating
System & Linux Kernel - Mobile Technology
View PDF Abstract
The introduction of Linux operating system into the embedded sector has been one
of the most exciting changes in the last few years. Based on the open-source model,
it offers new possibilities to embedded engineers traditionally used to commercial
operating systems. Qt for Embedded Linux (formerly known as Qtopia Core) is the
leading application framework for developing mobile applications and mobile-based
kernel systems (here, for the regional language Gujarati), as well as
single-purpose devices powered by embedded Linux. It provides a robust and proven
development environment inherited from the Qt cross-platform application framework
and key components developed specifically for embedded Linux. Qt for Embedded Linux
enables manufacturers to efficiently create devices with applications that are
tailored to market needs, specifically the Gujarati language.
Keywords: Qt, Linux, Qtopia, Cross-platform, Gujarati Language,
mobile application, mobile based kernel.
|
Dr. Prashant M. Dolia, Milan S. Bhatt
|
Volume-2 Issue-1 - Informed Embedding and Whitening Filter Correlator Based Spatial
Domain Data Hiding Technique
View PDF Abstract
Data hiding is mainly related to the possibility of perceiving the presence of an
object by means of human sight. There are many ways in which a perceptual model can
be incorporated into a data hiding system to control the perceptibility of data. In
this work the author implements a model for simple adjustment of a global embedding
strength based on the cover image. A whitening filter is used at the decoder, which
helps to de-correlate samples of the received image and improve the performance of
the system. The work encompasses the Watson perceptual model.
Keywords: Data Mining, Spatial Data, Data Hiding Technique.
|
Dr. Mehul S. Raval
|
Volume-2 Issue-1 - Planning and Implementation of Knowledge Grid in Indian Context
View PDF Abstract
There is a general opinion that providing internet access to colleges will result
in the delivery of better quality education to greater numbers of students, but
this expectation is not realized in practice. In academic institutions many
students waste time on email and browsing irrelevant sites. This reduces the
quality of education because most of them are confused by the pool of unstructured
information on the web. In this context, the Knowledge Grid is a well-structured
framework that takes inputs from a number of domain experts. The prototype model in
this paper attempts to take up the challenge of appropriate use of IT
infrastructure in the field of education using a knowledge-enabling approach.
Keywords: OLE, AGS, Transnet
|
Vinod L. Desai, Dr. Nilesh K. Modi, Dr. V R Rathod
|
Volume-2 Issue-1 - Quantum Computer and Quantum Algorithm for Travelling Salesman
Problem
View PDF Abstract
Drawing upon the extraordinary power of quantum computing algorithms, various
branches such as Quantum Cryptography, Quantum Information Technology and Quantum
Teleportation have emerged [1-4]. It is thought that this power of quantum
computing algorithms can also be successfully applied to many combinatorial
optimization problems.
In this article, a class of combinatorial optimization problems is chosen as a case
study under quantum computing. These problems are widely believed to be unsolvable
in polynomial time; the best known classical algorithms mostly provide suboptimal
solutions in finite time. The Travelling Salesman Problem (TSP) is one such
problem, studied here. A great deal of effort has already been devoted to devising
efficient algorithms that can solve the problem [5-18]. Moreover, the methods of
finding solutions for the TSP with Artificial Neural Networks and Genetic
Algorithms [5-8] do not provide the exact solution in all cases, except a few. A
successful attempt has been made to obtain a deterministic solution for the TSP by
applying the power of a quantum computing algorithm.
Keywords: Quantum Computing, Travelling Salesman Problem, Quantum
algorithms.
|
Utpal Roy, Sanchita PalChawdhury, Susmita Nayek
|
Volume-2 Issue-1 - Offline Typed Gujarati Character Recognition
View PDF Abstract
Character recognition has been a major concern since its inception, yet very
limited progress has been made, specifically for Indian languages. In this paper
the authors present recognition of offline computer-generated and printed Gujarati
characters. To identify characters, the authors use a modified version of a Hidden
Markov Model (HMM) based algorithm. The system is trained and tested on different
font sizes of Gujarati characters.
Keywords: Offline Typed Characters, Character Recognition, Optical Character
Recognition (OCR), HMM
|
Manish Kayasth, Dr. Bankim Patel
|
Volume-2 Issue-1 - Clustering Approach in Context Free Data Cleaning
View PDF Abstract
In this era of knowledge, organizations can gain competitive advantage only by
proficient data analysis. This paper emphasizes the application of clustering in
context-free data cleaning: attribute values are corrected using various sequence
similarity metrics where no reference data set is available, improving the quality
of data and in turn leading to better data analysis. The authors propose an
algorithm to examine the suitability of a value for correcting other attribute
values. Various sequence similarity metrics were used to find the distance between
two attribute values, to test the data and to generate results. Experimental
results show how the approach can effectively clean the data without reference
data.
Keywords: Clustering, Context free data cleaning, Sequence similarity
metrics.
|
Sohil D. Pandya, Dr. Paresh V. Virparia
|
Volume-2 Issue-1 - Autoplus: A Generic Automation test Framework for Testing and
Integration Challenges of Complex/Multilayered Software System
View PDF Abstract
Testing is as important and critical a phase as analysis and design for the
successful deployment of any software. Traditional testing and integration have
been manual and time-consuming. The concept of automating the testing process has
been around for a while and has been deployed and used successfully. But as the
complexity of software systems increased, and as organizations imposed more
processing on software, traditional automation was no longer sufficient. The
Automation Framework concept was introduced for the integration and testing of
these systems, and many off-the-shelf frameworks appeared. These frameworks,
however, have not been able to provide the automation solutions they promise.
More often than not, most
organizations have to use these frameworks as just another tool in the plethora
of tools they have purchased and also rely on internal resources to achieve the
desired results. Additionally, most organizations have proprietary protocol implementations
as well as in-house tools that are widely used by developers and system testers
to validate the software.
What we are discussing in this paper is how organizations can efficiently create
an Automation Test Framework (ATF) using the combination of off-the-shelf tools,
in-house tools, General Purpose License (GPL) tools and various high level scripting
languages. The proposed ATF consists of a User Interface, Driver Application, Execution
Engine, Unit Under Test (UUT), Verification Engine and a Reporting Engine. The case
study describes the implementation of such an Automation Framework for a multi-layered
multimedia delivery software system.
Keywords: ATF, Automation, Frameworks, Integration, Testing
|
Amit Kaul, Dr. Priyanka Sharma
|
Volume-2 Issue-1 - A Study on Factors Affecting Consumer Buying Behavior in Apparel
Category in Delhi and NCR region with Special reference to Koutons & Cotton County
Retail Stores
View PDF Abstract
This paper is an attempt to find the various factors which affect customer buying
behavior in the apparel category market in India, especially Delhi/NCR. The impact
of various buying factors like sales and promotions, placement of products, window
merchandising, effective price strategy, etc., on customer buying behavior has been
analyzed. Kotler's
black box model has been used as the basis of the current study to understand the
consumer buying behavior in Apparel Industry. The study is based on the primary
data collected from Koutons and Cotton County retail stores from the area of Delhi
and NCR regions with the help of structured questionnaire.
Keywords: consumer buying behaviour, apparel retail, visual merchandising
|
Anagha Shukre, Vaibhav Pratap Singh
|
Volume-2 Issue-1 - Dimensions of Liquidity Management - A Case Study of the Surat
Textile's Traders Co-operative Bank Ltd., Surat
View PDF Abstract
The paper illustrates the concept of liquidity management with reference to The
Surat Textile's Traders Co-Operative Bank Ltd., Surat, using seven parameters viz:
current ratio, liquidity ratio, cash position ratio, short-term investment to current
assets ratio, short-term advances to current assets ratio and short-term investment
& short-term advances to short-term deposits ratio.
|
Dr. Manisha Panwala
|
Volume-2 Issue-2 - A Futuristic Software Framework to Generate Actual Customer needs
for Quality Function Deployment
View PDF Abstract
Quality Function Deployment (QFD) is a product development process that encompasses
a vast amount of data gathered from customers through several market research
techniques like personal interviews, focus groups, surveys, video conferencing,
etc. This massive, unsorted and unstructured data must be transformed into a
limited set of structured information representing the actual 'Customer Needs'.
However, the process is tedious and time-consuming and cannot be handled manually.
In order to address these issues, this paper proposes a futuristic software
framework based on an Affinity Process. The paper begins with an introduction to
the topic and outlines the QFD process. It then describes the Affinity Process,
builds the data structure, and makes an attempt to build the proposed framework
using Visual Basic (VB) and MS Access.
QFD software and it is anticipated that when completely developed, it would act
as a vital component of QFD software.
Keywords: QFD, Affinity Process, Visual Basic, MS-Access, Software,
Customer Needs.
|
Ashish K. Sharma, I. C. Mehta, Dr. J. R. Sharma, Sangita Sharma
|
Volume-2 Issue-2 - Signature Recognition and Verification using Artificial Neural
Networks: A Comparative Study
View PDF Abstract
The paper presents the comparative analysis of signature recognition and verification
using Neural Networks. Three well-known and widely used Neural Networks viz. Support
Vector Machines, Multilayer Perceptron and Radial Basis Function Network have been
used. Each network differs from the other in the manner it approaches the signature
given for recognition and verification. A signature database is collected using
intrapersonal variations for evaluation. For every 6 training examples, 4 are used
to test the signatures based on measures such as false rejection rate, false
acceptance rate, equal error rate, and average error rate. The merits and demerits
of all the approaches are evaluated, and the results of numerical experiments are
given and analyzed in the paper. Off-line recognition and verification are
performed with the objective of performance comparison. The three networks are
compared with respect to the complexity of their structure as well as the accuracy
of expected results, so that forgeries can be minimized.
Keywords: Neural Networks, Offline Signature Verification, Multilayer Perceptron,
Support Vector Machine, Radial Basis Function Network.
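For reference, the error rates used above computed from hypothetical match scores
(the equal error rate, EER, is read off where the two rates cross as the threshold
is swept); the scores below are invented, not the paper's data.

def far_frr(genuine, forgery, threshold):
    # FAR = forgeries accepted / all forgeries; FRR = genuine rejected / all
    # genuine, given match scores where higher means "more likely genuine".
    far = sum(s >= threshold for s in forgery) / len(forgery)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

# Hypothetical match scores from a verifier (e.g., an MLP or SVM output).
genuine = [0.9, 0.8, 0.85, 0.6, 0.95]
forgery = [0.3, 0.55, 0.2, 0.65, 0.4]
for t in (0.5, 0.6, 0.7):
    print(t, far_frr(genuine, forgery, t))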
|
Nidhi Arora
|
Volume-2 Issue-2 - Creating Web Unite of Web Communities and Derive Astonishing
Information from Web Unite
View PDF Abstract
The Web harbors a large number of communities - groups of content-creators sharing
a common interest - each of which manifests itself as a set of interlinked Web
pages. Newsgroups and commercial Web directories together contain of the order of
20,000 such communities; our particular interest here is in topic-based
communities. There is a type of information called unexpected information, which
is of great interest. Finding unexpected information is useful in many
applications. For example, it is useful for a company to find unexpected
information about its competitors, e.g., unexpected services and products that its
competitors offer. With this information, the company can learn from its
competitors and/or design counter-measures to improve its competitiveness. The
research tries to form groups of web sites with a common objective and then derive
information by comparing those web sites. It proposes a methodology through which
we can group web sites of the same type and find unexpected information among them.
Keywords: Web Unite, Web Mining, Information Extraction
|
Nisarg N. Pathak, Dr. Nilesh K. Modi, Dr. S. M. Shah
|
Volume-2 Issue-2 - A New Model for Congestion Detection in High Speed Network with
High Speed Protocols Creating
View PDF Abstract
Continuously growing needs of Internet applications that transmit massive amounts
of data have led to the emergence of high speed networks. The primary factor
hindering the flow of traffic is network congestion; data transfer must take place
without congestion. The Internet carries certain critical data and information
which need to be delivered to the receiver at all costs, so congestion detection
plays a key role in high speed networks. Traditional TCP has been used over the
years to detect and control congestion, adopting algorithms such as ECN, DECbit and
the RED queue mechanism. In this paper, a rational mechanism utilizing the XCP
protocol is used, adopting a new congestion detection module with an effective
feedback mechanism and a modified window adjustment. A separate congestion
detection module is developed to detect congestion and trigger immediate control.
The feedback parameters are calculated from the arrival rate, service rate, traffic
rate and queue size. The result is no drastic decrease in window size and a better
increase in sending rate, giving a continuous flow of data without congestion;
consequently throughput is maximized, bandwidth utilization is high, and delay is
minimal. The results of the proposed work are presented as graphs of throughput,
delay and window size. Thus the XCP protocol is illustrated and the various
congestion detection parameters are thoroughly analyzed and presented.
Keywords: TCP/IP, Congestion Detection, Window Management, Feedback
Control, Queue Management, Explicit Control Protocol.
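An XCP-style sketch of the feedback computation described above; the constants and
formulation are the commonly cited ones, not necessarily the paper's exact module.

def congestion_feedback(arrival_rate, capacity, queue, avg_rtt,
                        alpha=0.4, beta=0.226):
    # XCP-style aggregate feedback: a share of the spare bandwidth minus a
    # term that drains the standing queue over one average RTT (bytes/sec).
    spare = capacity - arrival_rate
    return alpha * spare - beta * queue / avg_rtt

# A router slightly over-driven with a 30 KB standing queue.
fb = congestion_feedback(arrival_rate=1.05e6, capacity=1.0e6,
                         queue=30_000, avg_rtt=0.1)
print(f"aggregate feedback: {fb:,.0f} bytes/sec")  # negative: senders slow down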
|
T. Sheela, Dr. J. Raja
|
Volume-2 Issue-2 - RC-5 Protocol Based Embedded Control System for Home Securities
View PDF Abstract
For better living, it is very important to pay attention to various aspects of life
like security, safety and comfort. Our home can be made secure in almost all
respects using an automatic embedded control system. The system provides security
against burglary and various disaster conditions like fire, flood, cyclones, etc.
The control system continuously observes different parameters like temperature,
smoke and wind velocity, and compares them with the standard data fed to the
system. If it detects any abnormal condition, the control system works out the
coordinates of the abnormality, alerts users, and passes the information on to the
base control system for further action. The system can be constructed using
embedded hardware with an appropriate firmware program. The RC-5 protocol can be
used to establish the communication link between the measurement/primary control
systems and the base control system.
Keywords: RC-5 Protocol, Embedded System, Base Control System (BCS), Central
Control System (CCS), Home Security.
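As an illustration of the protocol itself, building the standard 14-bit RC-5 frame
and its bi-phase timing in Python; the device address and command are invented
sensor-node values, and the exact firmware encoding may differ.

def rc5_frame(address, command, toggle=0):
    # Standard 14-bit RC-5 frame: two start bits, a toggle bit,
    # 5 address bits and 6 command bits (MSB first).
    assert 0 <= address < 32 and 0 <= command < 64
    bits = [1, 1, toggle]
    bits += [(address >> i) & 1 for i in range(4, -1, -1)]
    bits += [(command >> i) & 1 for i in range(5, -1, -1)]
    return bits

def biphase(bits, half_bit_us=889):
    # RC-5 bi-phase coding: logical 1 = low->high, 0 = high->low, each
    # half-bit lasting ~889 microseconds on a 36 kHz IR carrier.
    levels = []
    for b in bits:
        levels += ([0, 1] if b else [1, 0])
    return [(level, half_bit_us) for level in levels]

# e.g. a sensor node reporting "smoke detected" as device 5, command 12.
print(rc5_frame(address=5, command=12))
print(biphase(rc5_frame(5, 12))[:4])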
|
Hitesh J. Lad, Dr. Vibhuti G. Joshi
|
Volume-2 Issue-2 - Quantitative Trade-off Analysis in Quality Attributes of a Software
Architecture using Bayesian Network Model
View PDF Abstract
Research into design rationale in the past has focused on argumentation-based design
deliberations. These approaches cannot be used to support change impact analysis
effectively because the dependency between design elements and decisions are not
well represented and cannot be quantified. Without such knowledge, designers and
architects cannot easily assess how changing requirements and design decisions may
affect the system. We apply Bayesian Network Model (BNM), to capture the probabilistic
causal relationships between design elements and decisions. We employ three different
BNMbased reasoning methods to analyze the trade-off between the conflicting quality
attributes. Markov blanket discovery algorithms can be used for quality assessment
BNMs. Additionally, work will be done to determine how known optimization methods
such as Tabu search may be applied in the context of the proposed framework. Ultimately,
the goal is to create a possibility of automatic execution of steps involved in
architectural optimization.
Keywords: Bayesian Network Model (BNM), Architectural Tradeoff Analysis Method
(ATAM), Software Architecture Analysis Method (SAAM), Software Architecture
Assessment using Bayesian Networks (SAABNet), Stakeholders' Expectations (SHE), and
Markov Blanket (MB).
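To make the BNM idea concrete, here is a toy sketch (not the authors' model) in which a single design decision probabilistically affects two competing quality attributes:

```python
# A minimal sketch of quantifying a trade-off with a Bayesian network: a
# design decision ("use caching") probabilistically improves performance but
# hurts modifiability. All probabilities are illustrative assumptions.

P_DECISION = {True: 0.5, False: 0.5}     # P(use caching)
P_PERF = {True: 0.9, False: 0.5}         # P(good performance | decision)
P_MOD = {True: 0.4, False: 0.8}          # P(good modifiability | decision)

def posterior(attr_cpt):
    """P(attribute is good), marginalized over the design decision."""
    return sum(P_DECISION[d] * attr_cpt[d] for d in (True, False))

print("P(good performance)   =", posterior(P_PERF))
print("P(good modifiability) =", posterior(P_MOD))
```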
|
N. Sankar Ram, Paul Rodrigues
|
Volume-2 Issue-2 - Distributed Data Mining for Synthesizing High Frequency Association
Rules : A Case Study for Determining Service Quality in Hospitals
Many large organizations have multiple data sources, and putting all the data from
different sources together would amass a huge database for centralized processing.
Data mining involves the exploration and analysis of large amounts of data in order
to discover meaningful patterns. Mining association rules at the individual data
sources and forwarding the rules to the centralized company headquarters provides a
feasible way to deal with the multiple-data-source problem. However, the rules
forwarded from different data sources may be too many for the centralized
headquarters to use, so there is a need to find the high frequency rules that play a
major role in the decision-making process.
A weighting method is proposed in this paper for identifying valid rules among the
large number of rules forwarded from different data sources. Valid rules are rules
that are supported by most of the branches of an organization. The method is applied
to rank rules based on patient-perceived service quality in a hospital. Experimental
results show that the proposed weighting model is efficient and effective.
Keywords: Association-based data mining, Data reduction, Weights, SERVQUAL scale.
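A small sketch of the rule-synthesis step in the spirit of the paper: rules forwarded by hospital branches are ranked by how many branches support them. The majority threshold and the toy rules are illustrative assumptions, not the paper's weighting model:

```python
# Rank forwarded rules by cross-branch support; keep "valid" rules supported
# by a majority of branches. Toy data and threshold are illustrative.

from collections import Counter

branch_rules = {
    "branch_a": {"clean_ward -> satisfied", "short_wait -> satisfied"},
    "branch_b": {"clean_ward -> satisfied", "polite_staff -> satisfied"},
    "branch_c": {"clean_ward -> satisfied", "short_wait -> satisfied"},
}

support = Counter(r for rules in branch_rules.values() for r in rules)
n = len(branch_rules)

valid = sorted((r for r, c in support.items() if c > n / 2),
               key=lambda r: -support[r])
for rule in valid:
    print(f"{rule}  (supported by {support[rule]}/{n} branches)")
```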
|
Anirban Chakrabarty, Sonal G. Rawat
|
Volume-2 Issue-2 - Implementation of Integrated Development Environment and Compiler
for GPL
Computer programming is important today, simply because technology has dominated
the globe for over a decade. New programming languages appear at regular intervals,
usually supporting programming in English. Very little work is available to support
programming in regional languages. In this paper, the authors present the
development of a Gujarati Programming Language (GPL) that supports programming in
the regional language Gujarati.
Keywords: Programming Language, Compiler, Integrated Development
Environment, Scanner, Parser, Unicode
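A minimal scanner sketch for a regional-language compiler of this kind; the Gujarati keyword spellings and token names are hypothetical, not GPL's actual vocabulary:

```python
# Unicode-aware scanner: keywords are Gujarati strings mapped to token types.
# Keyword spellings below are illustrative assumptions.

import re

KEYWORDS = {"જો": "IF", "નહિતર": "ELSE", "છાપો": "PRINT"}  # hypothetical
TOKEN_RE = re.compile(r"\s*(?:(\d+)|([^\W\d]+)|(.))")

def scan(source):
    tokens = []
    for number, word, other in TOKEN_RE.findall(source):
        if number:
            tokens.append(("NUMBER", number))
        elif word:
            tokens.append((KEYWORDS.get(word, "IDENT"), word))
        elif other.strip():
            tokens.append(("SYMBOL", other))
    return tokens

print(scan("જો x 10 છાપો"))
```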
|
Kalpesh B. Lad, Dr. Bankim Patel
|
Volume-2 Issue-2 - Indian ITES Industry: Emerging Trends & Challenges in Global
Perspective
This research article gives a brief overview of the IT Enabled Services scenario in
India. It briefly explains the meaning of IT Enabled Services and covers the various
services that have been identified as IT Enabled Services. It also throws some light
on the opportunities available to the industry. The author has also tried to
identify some of the challenges and roadblocks in the way of the growth of IT
Enabled Services in India.
Keywords: IT enabled services, brand equity, IT infrastructure
|
Dr. Jaydip Chaudhari
|
Volume-2 Issue-2 - Industrial Scope of 2D Packing Problems
Packing problems are optimization problems encountered in many areas of business
and industry and have wide applications. These problems look for a good arrangement
of multiple items in some larger containing region, with the objective of maximizing
the utilization of resource material. The 2D packing problem has wide industrial
applications, ranging from small-scale industries working with leather, furniture,
glass, metal and wood to large-scale industries dealing with textiles, garments,
paper, shipbuilding, automobiles and VLSI design. It has been observed that
automated nesting solutions based on heuristics prove better than conventional
methods, in which a few intuitive arrangements were tried by experienced craftsmen
and the final layout depended on the dexterity of the skilled craftsperson. In this
paper the authors summarize the different approaches used to solve the 2D packing
problem along with their industrial applications. Accordingly, this study is an
academic review of the industrial applications of the 2D packing problem.
Keywords: Packing problem, Trim Loss problem, Rectangle Packing,
Bin Packing, Cutting and Packing.
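For concreteness, here is a generic shelf (level) heuristic for packing rectangles into a fixed-width strip; it is a textbook-style heuristic of the family the paper surveys, not a method taken from the paper:

```python
# Shelf next-fit: sort rectangles tallest first, place them left to right on
# shelves inside a fixed-width strip, opening a new shelf when a rectangle
# does not fit. Minimizing used strip height is the packing objective.

def shelf_pack(rects, strip_width):
    """rects: list of (w, h). Returns placements (x, y, w, h) and used height."""
    placements, x, y, shelf_h = [], 0, 0, 0
    for w, h in sorted(rects, key=lambda r: -r[1]):   # tallest first
        if x + w > strip_width:                        # open a new shelf
            x, y, shelf_h = 0, y + shelf_h, 0
        placements.append((x, y, w, h))
        x += w
        shelf_h = max(shelf_h, h)
    return placements, y + shelf_h

layout, height = shelf_pack([(4, 3), (3, 3), (5, 2), (2, 2)], strip_width=8)
print("strip height used:", height)
```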
|
Kawaljeet Singh, Leena Jain
|
Volume-3 Issue-1 - Securing Datawarehouse Using Multidimensional Modeling with Virtual
Private Database
Enterprise data warehouses are often very large systems and serve many user
communities. Data warehouses play a key role in business intelligence and crucial
decision making. In addition to normal data warehouse functionality, they require
flexible and powerful security features, and these security capabilities must be
incorporated seamlessly in an environment with stringent performance and scalability
requirements. This paper advocates a new concept of security design: depicting
security mechanisms at the conceptual design stage itself using multidimensional
modeling. VPD has been selected as the implementation-phase tool that realizes the
security mechanisms depicted at the design stage.
Keywords: Data warehouse security, multidimensional modeling, conceptual
design, VPD, dimensional level security, attribute level security, row level security.
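A small sketch of VPD-style row-level security, mimicked generically in Python rather than in Oracle syntax: a per-role policy function returns a predicate that is appended to every query against the protected table. Role names and the predicate rules are illustrative assumptions:

```python
# Generic row-level security in the VPD style: a policy function produces a
# predicate for the querying user; the predicate is appended to the query.

POLICIES = {
    "regional_manager": lambda user: f"region = '{user['region']}'",
    "analyst":          lambda user: "1 = 1",        # unrestricted
}

def secure_query(base_sql, user):
    predicate = POLICIES[user["role"]](user)
    joiner = " AND " if " where " in base_sql.lower() else " WHERE "
    return base_sql + joiner + predicate

user = {"role": "regional_manager", "region": "WEST"}
print(secure_query("SELECT SUM(sales) FROM fact_sales", user))
```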
|
Veena N. Jokhakar, Dr. S.V. Patel
|
Volume-3 Issue-1 - Clustering Based Outlier Detection Method for Network Based Intrusion
Detection
The discovery of objects with exceptional behavior is an outstanding challenge from
a knowledge discovery standpoint and has received considerable attention in
applications such as network attack and fraud detection. This paper proposes a
simple clustering-based algorithm to detect outlying objects. A central problem for
a network intrusion detection system is an attacker's ability to exploit ambiguities
in the traffic stream. Network-based intrusion detection monitors the traffic of a
particular network segment and analyzes network and application protocol activity to
identify suspicious behavior. Several recently developed outlier detection schemes
exist for detecting attacks in a network. In this paper, the proposed algorithm is
applied to a network intrusion detection system to detect ambiguities or violations
in the network traffic stream.
Keywords: Outlier Detection, Clustering, Network based Intrusion
Detection
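A generic sketch of the clustering-based outlier idea: cluster the traffic feature vectors and treat members of very small clusters as outlying objects. The 5% size threshold and the synthetic data are illustrative assumptions, not the paper's parameters:

```python
# Cluster traffic features with k-means, then flag members of tiny clusters
# as outliers. Data and threshold are synthetic stand-ins.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 2))     # ordinary connection features
attacks = rng.normal(6.0, 0.3, size=(5, 2))      # a few anomalous connections
X = np.vstack([normal, attacks])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
sizes = np.bincount(km.labels_)
tiny = np.where(sizes < 0.05 * len(X))[0]        # clusters too small to be normal
outliers = np.where(np.isin(km.labels_, tiny))[0]
print("flagged record indices:", outliers)
```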
|
Deevi Radha Rani
|
Volume-3 Issue-1 - Betterment of Business Communication with Steganography and Cryptography
One of the biggest problems that most companies have regarding security technology
is that they don't develop a strategy for implementation. Many companies seem to
have the attitude that implementing a single technology, such as firewalls, will
make them secure. In reality, just putting a firewall in place is not enough: That
firewall has to be designed and configured properly for it to do its job successfully,
and it must be used in concert with other security measures. And all these security
measures must be properly integrated with the current network.
Secure communication is no different from information security: You have to pick
the right tools, implement them properly, and train users to use them in their daily
work.
This paper will look at the kind of assessment involved in developing a secure communications
strategy that might include stego and crypto.
Keywords: Steganography, Cryptography, Communication, public key
steganography, private key steganography, Steganalysis
|
Divyakant T. Meva, Jaypalsinh A. Gohil, Amit K. Patel
|
Volume-3 Issue-1 - Implementing Advanced Intrusion Detection System by Monitoring
Network Anomalies and using Encrypted Access of Data
The Telnet, rlogin, rcp and rsh commands have a number of security weaknesses: all
communications are in clear text and no machine authentication takes place. These
commands are open to eavesdropping and TCP/IP address spoofing. SSH uses
public/private-key RSA authentication to check the identity of communicating peer
machines and encrypts all exchanged data with strong algorithms such as Blowfish,
3DES and IDEA. In this paper we propose an IDS for encrypted access to public
network servers with the SSH2 protocol. Our system detects intrusions based on
transferred data size and timing, which are available without decryption. The
results reveal that the proposed system works well for different kinds of
intrusions; pre-operations are not needed and privacy is not violated. The detection
is based on anomaly detection, which relies on the frequency of similar accesses and
the characteristics of usual HTTP accesses.
Keywords: IDS, SSH, SSH2, MD5, MAC
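A toy sketch of the detection idea, assuming made-up training sessions and a z-score rule: model the typical size/timing profile of encrypted sessions and flag sessions that deviate, with no decryption required:

```python
# Anomaly detection on size/timing features of encrypted sessions. Training
# data and the 3-sigma rule are illustrative assumptions.

import math

train = [(1200, 0.8), (1100, 0.9), (1300, 1.1), (1250, 1.0)]  # (bytes, sec)

def stats(values):
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, math.sqrt(var)

size_mu, size_sd = stats([s for s, _ in train])
time_mu, time_sd = stats([t for _, t in train])

def is_suspicious(size, duration, z=3.0):
    return (abs(size - size_mu) > z * size_sd or
            abs(duration - time_mu) > z * time_sd)

print(is_suspicious(1225, 0.95))   # normal-looking session -> False
print(is_suspicious(90000, 0.2))   # bulk transfer, odd timing -> True
```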
|
J. Arokia Renjit, Dr. K. L. Shunmuganathan
|
Volume-3 Issue-1 - Motion Trajectory Based Video Content Retrieval and Delivery
for Small Displays
Adaptive multimedia content retrieval and delivery for small displays is one of the
challenges faced by the multimedia community. An input video is transformed into an
output video through manipulations at multiple levels (signal, structural or
semantic) to meet diverse resource constraints and user preferences while optimizing
the overall utility of the video. The proposed system displays the retrieved video
shot, by motion trajectories of individual objects, on small displays. It takes
video shots as input, whose motion vectors are extracted using an exhaustive search
algorithm. This shot-level motion feature is linked across the consecutive frames of
the shot to form motion trajectories; redundant trajectories are removed and one
representative trajectory is preserved from each group of similar trajectories. The
representative object motion trajectory is stored in a database. A query interface
allows users to search for similar video shots by supplying a query video clip as
input, and a similarity matching algorithm retrieves similar video shots from the
database by comparing their motion trajectories. Next, in order to display the
retrieved video shots on a small display, shape information of the moving objects is
extracted using a region-growing algorithm. The segmented foreground is scaled down
and re-integrated with the repaired and directly resized background to deliver an
effective video shot for small displays.
Keywords: Motion Trajectory, Exhaustive Search algorithm, Douglas-Peucker
algorithm, Region Growing algorithm, Content-based video retrieval (CBVR).
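A generic sketch of the similarity-matching step: trajectories are resampled to a fixed length and compared by mean Euclidean distance. The resampling and distance choices are illustrative, not the paper's exact measure:

```python
# Compare motion trajectories by resampling to n points and averaging the
# pointwise Euclidean distance; retrieve the closest shot in the database.

def resample(traj, n=8):
    """Pick n roughly evenly spaced points from a list of (x, y)."""
    step = (len(traj) - 1) / (n - 1)
    return [traj[round(i * step)] for i in range(n)]

def distance(a, b):
    ra, rb = resample(a), resample(b)
    return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for (ax, ay), (bx, by) in zip(ra, rb)) / len(ra)

query = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
database = {
    "shot_1": [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4), (5, 5)],
    "shot_2": [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)],
}
best = min(database, key=lambda k: distance(query, database[k]))
print("closest shot:", best)
```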
|
P. Geetha, Dr. Vasumathi Narayanan
|
Volume-3 Issue-1 - Channel Sharing Scheme for Cellular Networks Using BDDCA Protocol
in WLAN
|
P. Jesu Jayarin, Dr. T. Ravi
|
Volume-3 Issue-1 - A Font and Size Independent OCR for Machine Printed Gujarati Numerals
Character recognition has been a major research area since its inception, yet only
limited progress has been made, particularly for Indian languages. Recognition of
the Gujarati script is a less-studied area, and no significant attempt has been made
so far to recognize Gujarati glyphs. In this paper we present a simple yet robust
solution for the recognition of offline, multi-font, computer-generated and
machine-printed Gujarati numerals. Following pre-processing, we use
correlation-based template matching, in which a numeral is identified by analyzing
its shape and comparing the features that distinguish each numeral. The system
appears to be very robust against font variations and large shape variations.
Keywords: Template, Correlation, Segmentation, Normalization, Probability,
etc.
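The core technique, correlation-based template matching, can be sketched as follows; the 3x3 arrays are toy stand-ins for binarized numeral glyphs:

```python
# Compare a candidate glyph against stored templates using a normalized
# correlation coefficient; the best-scoring template wins.

import numpy as np

def ncc(a, b):
    """Normalized correlation coefficient between two equal-size images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

templates = {
    "1": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float),
    "7": np.array([[1, 1, 1], [0, 0, 1], [0, 1, 0]], float),
}
glyph = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 1]], float)

best = max(templates, key=lambda k: ncc(glyph, templates[k]))
print("recognized as:", best)
```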
|
Shailesh A. Chaudhari, Dr. Ravi M. Gulati
|
Volume-3 Issue-1 - A Multi-way Acknowledgment Protocol to Detect Misbehaving Nodes
in MANETs
Quality Function Deployment (QFD) is a product development process that encompasses
a vast amount of data gathered from customers through several market research
techniques such as personal interviews, focus groups, surveys and video
conferencing. This massive, unsorted and unstructured data must be transformed into
a limited amount of structured information that represents the actual 'customer
needs'. The process, however, is tedious and time-consuming and cannot be handled
manually. To address these issues, this paper proposes a software framework based on
an affinity process. The paper begins with an introduction to the topic and outlines
the QFD process; it then describes the affinity process, builds the data structure,
and attempts to build the proposed framework using Visual Basic (VB) and MS-Access.
The proposed framework is developed as part of QFD software, and it is anticipated
that, when completely developed, it will act as a vital component of that software.
Keywords: QFD, Affinity Process, Visual Basic, MS-Access, Software,
Customer Needs.
|
S. Usha, Dr. S. Radha
|
Volume-3 Issue-1 - Biological Data Integration using Virtual Database
Biological data integration is considered one of the most important and challenging
tasks in bioinformatics. Scientific achievements greatly depend on an integrated
view of largely diverse sets of data. Biological data reside in hundreds of
databases, and no single database provides an integrated view of the data, which
strongly motivates the need for data integration. Though there are different
approaches to data integration, such as data warehousing, federation and web
services, each has its own pros and cons and implementation challenges. In this
research paper, we propose a framework using a virtual database to integrate
different biological data sources.
Keywords: Data Integration, Data Warehouse, Data Federation, Web Service,
Virtual Database
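A tiny mediator-style sketch of the virtual-database idea: nothing is materialized, and each query fans out to per-source adapters at run time. The in-memory "sources" are illustrative stand-ins for real biological databases:

```python
# Virtual (mediated) query: rewrite a request into per-source lookups and
# merge the results on the fly. The toy sources are hypothetical.

SEQ_DB = {"P12345": "MKTAYIAK..."}                    # protein -> sequence
ANNOT_DB = {"P12345": ["kinase activity", "ATP binding"]}

ADAPTERS = {
    "sequence":    lambda pid: SEQ_DB.get(pid),
    "annotations": lambda pid: ANNOT_DB.get(pid, []),
}

def virtual_query(protein_id, fields):
    """Answer an integrated query by fanning out to per-source adapters."""
    return {f: ADAPTERS[f](protein_id) for f in fields}

print(virtual_query("P12345", ["sequence", "annotations"]))
```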
|
Ateet Mehta, Dr. Kalpesh Lad, Dr. Bankim Patel
|
Volume-3 Issue-1 - Volatility Analysis of National Stock Exchange of India
The paper investigates the nature and pattern of volatility of the National Stock
Exchange (NSE) price index, the S&P CNX Nifty. The data comprise daily observations
of the NSE price index covering the period from 1st January 2000 to 10th September
2007. Various volatility estimators and diagnostic tests suggest stylized facts
about volatility such as volatility clustering, mean reversion and asymmetry. The
Lagrange Multiplier test indicates the presence of ARCH effects in the stock market.
The paper applies the family of ARCH models to examine the asymmetric volatility of
the NSE. We find that a first-order GARCH model fits the data better than
higher-order ARCH models. Our analysis suggests that the EGARCH and TARCH models
outperform the conventional symmetric GARCH models. The estimated TARCH and EGARCH
parameters show that the impact of news is asymmetric, indicating the existence of a
leverage effect in the stock's future price; the leverage effect in the Nifty is
captured well by the TARCH model. Application of ARCH-M models found no strong
evidence of higher returns during periods of high volatility.
Keywords: Volatility, Volatility clustering, ARCH, GARCH, EGARCH,
TARCH, ARCH-M, JEL Classification: C22, C52
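As a sketch of the estimation step, the Python arch package can fit GARCH(1,1) and an asymmetric GJR/TARCH-type model; this is an assumed toolchain with synthetic returns, not a reproduction of the paper's results:

```python
# Fit symmetric and asymmetric volatility models to stand-in daily returns;
# the `o` term captures the leverage effect described in the abstract.

import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=6, size=2000)     # synthetic stand-in for Nifty returns

garch = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
tarch = arch_model(returns, vol="Garch", p=1, o=1, q=1, power=1.0).fit(disp="off")

print(garch.params)   # mu, omega, alpha[1], beta[1]
print(tarch.params)   # adds gamma[1], the asymmetry/leverage term
```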
|
Dr. Prashant Joshi
|
Volume-3 Issue-2 - An Efficient Intrusion Detection System using Computational Intelligence
Intrusion detection systems are among the most widely used defense tools in computer
networks, and a great deal of research on them has been published. In this paper we
present a survey of intrusion detection systems, covering the existing types,
techniques and approaches in the literature. Finally, we propose a new architecture
for an intrusion detection system and outline the current research challenges and
issues in intrusion detection.
Keywords: Intrusion Detection, Neural Network, Fuzzy logic, Artificial
Intelligence, Honeypot, Data mining.
|
J. Visumathi, Dr. K. L. Shunmuganathan
|
Volume-3 Issue-2 - Using XML Schema for platform-neutral structure transfer
Extensible Markup Language (XML) is a simple, very flexible text format, originally
designed to meet the challenges of large-scale electronic publishing. XML also plays
an increasingly important role in the exchange of a wide variety of data on the Web
and elsewhere [1]. XML's design goals emphasize simplicity, generality, and
usability over the Internet [2]. Although XML's design focuses on documents, it is
widely used for the representation of arbitrary data structures through the use of
schemas. XML provides a widely adopted standard for representing text and data in a
format that can be processed without much human or machine intelligence. Information
formatted in XML can be exchanged across platforms, languages, and applications, and
can be used with a wide range of development tools and utilities. Given the long
list of applications and the advantages of XML over proprietary relational databases
for platform-neutral data representation, this paper studies and evaluates the XML
Schema representation produced by Microsoft Access 2003 and its ability to transfer
all the properties of a table.
Keywords: XML, Schema, XML converter, Microsoft Access 2003.
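The round-trip idea can be sketched with lxml: table structure travels as an XML Schema and instance data is validated against it on the target platform. The schema below is a hand-written stand-in for an Access 2003 export:

```python
# Validate XML instance data against an XML Schema carrying table structure.

from lxml import etree

XSD = b"""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="employee">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="id" type="xs:integer"/>
        <xs:element name="name" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

schema = etree.XMLSchema(etree.XML(XSD))
row = etree.XML(b"<employee><id>7</id><name>Asha</name></employee>")
print("valid row:", schema.validate(row))          # True
bad = etree.XML(b"<employee><id>x</id><name>Asha</name></employee>")
print("valid row:", schema.validate(bad))          # False: id is not an integer
```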
|
Dr. P. V. Virparia, Nehal Daulatjada, Priya Swaminarayan, Dr. V. R. Rathod
|
Volume-3 Issue-2 - Optimizing WLAN Design for E-Learning Environment
This paper presents WLAN model optimization rules and simulation results for
e-learning environments such as universities and other academic institutions. A
mathematical approach to WLAN design for coverage and access-point placement
estimation is discussed. WLAN throughput and workstation load are estimated for an
e-learning environment in which voice and video transmission plays a major role.
Keywords: WLAN, HTTP, FTP, 802.11 b/g/n, Streaming services
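A back-of-envelope sketch of the kind of estimation discussed, with all figures as illustrative assumptions:

```python
# Estimate access-point count from area coverage and per-station throughput
# from the shared channel rate. Every number here is an assumption.

import math

area_m2 = 3000                 # floor area to cover
ap_radius_m = 15               # usable indoor coverage radius per AP
stations = 120                 # concurrent e-learning clients
effective_rate_mbps = 22       # realistic 802.11g goodput, not the 54 Mbps nominal

ap_coverage = math.pi * ap_radius_m ** 2
num_aps = math.ceil(area_m2 / ap_coverage)
per_station = num_aps * effective_rate_mbps / stations

print(f"APs needed: {num_aps}, ~{per_station:.2f} Mbps per station")
```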
|
Nayan V. Jobanputra, Dr. Nikesh A Shah
|
Volume-3 Issue-2 - An Algorithm to implement Dynamic Access Control using Anomaly
based Detection with VLAN Steering
Intrusion detection and prevention systems are deployed in thousands of computer
networks worldwide. The basic difference between the detection and prevention
techniques lies in how they provide protection for network environments. An IDS
monitors logged data and compares it with attack signatures to detect unwanted
access; for such identification, an IDS normally uses signatures or other unique
characteristics of the attacks.
In this paper, we have designed an algorithm to achieve dynamic access control.
Dynamic access control requires implementing three functionalities: traffic
monitoring, validation and policy enforcement. In this algorithm, traffic monitoring
and validation are done using anomaly-based detection during access. For policy
enforcement and attack prevention, we have chosen the VLAN steering method, because
it can be used with both the out-of-band and the in-band approach; we need to
implement both approaches to achieve access control dynamically. This helps prevent
insider as well as outsider attacks on a network. To prove the concept of blocking a
malicious host after it has been successfully admitted to a network, we present an
example and a working algorithm for anomaly-based detection. The algorithm uses
IDS-logged data from a database for traffic monitoring and validation, and also
updates the signatures stored in the signature database. An IPS sensor helps perform
VLAN steering in our system for quarantining suspicious hosts.
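A compact sketch of the monitor-validate-enforce loop, where set_vlan is a placeholder for the real switch/IPS steering action and the packet-rate rule is an illustrative anomaly criterion:

```python
# Monitor per-host traffic, validate against a learned baseline, and enforce
# by steering anomalous hosts into a quarantine VLAN.

QUARANTINE_VLAN = 999
BASELINE_PPS = 200            # learned 'normal' packets/sec per host

def set_vlan(host, vlan):     # placeholder for the real steering action
    print(f"steering {host} to VLAN {vlan}")

def enforce(traffic):
    """traffic: dict of host -> observed packets/sec."""
    for host, pps in traffic.items():
        if pps > 5 * BASELINE_PPS:        # anomaly: far above baseline
            set_vlan(host, QUARANTINE_VLAN)

enforce({"10.0.0.5": 150, "10.0.0.9": 4200})   # quarantines 10.0.0.9
```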
|
Shalvi Dave, Dr. Bhushan Trivedi
|
Volume-3 Issue-2 - Precision and Real-Time Irrigation Control using Wireless Sensor
Networks: Opportunities, Issues and Challenges
Water is considered one of the most precious resources to be managed in order to
maintain the ecological balance of the earth. Irrigation consumes a major portion of
the water demand; hence, due attention is needed to improve irrigation techniques
and to estimate the actual water demand for irrigation. The present paper describes
several moisture measurement techniques and their limitations, and proposes and
justifies a solution using wireless sensor networks to work out the exact water
demand in controlled irrigation. It also highlights the major constraints of WSNs
for the proposed solution.
Keywords: Irrigation, soil moisture measurement, wireless sensor
networks.
|
Kamlendu Kumar Pandey, Dr. S. V. Patel
|
Volume-3 Issue-2 - A Study on Export Import procedures at Visakhapatnam Port Trust
Visakhapatnam port, one of the leading major ports of India, has been playing a
vital role in fostering the country's foreign trade, economic development and
national development. In this paper an attempt has been made to focus on export and
import facilities, tariff rates, problems, the present EXIM policy, and the major
exports and imports. These variables are tested by applying the chi-square test in
relation to year of establishment, operational area and type of firm. The results
showed that 90 per cent of the respondents need additional warehousing and storage
facilities at VPT, and a little more than fifty per cent of the respondents want
improvements in export and import facilities. All the respondents face problems of
different kinds at VPT, and to overcome these problems the port authorities should
take the necessary steps. It is interesting to note that none of the respondents
reported poor performance by the Andhra Pradesh government regarding export and
import policies. The major exports are iron ore, engineering items, liquid cargo,
etc.; the imports are edible oil, crude oil, chemicals, fertilizers, etc. Nearly 50
per cent of the respondents revealed that exports are more helpful for the country.
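An illustrative chi-square test of independence of the kind the study applies, with made-up counts (scipy assumed); the categories are hypothetical, not the survey's data:

```python
# Chi-square test of independence between type of firm and perceived need
# for more warehousing. Counts are fabricated for illustration only.

from scipy.stats import chi2_contingency

#                 need more  satisfied
observed = [[34, 6],     # exporters
            [41, 9],     # importers
            [15, 5]]     # clearing agents

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```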
|
Dr. D. M. Sheaba Rani, Dr. K. Hari Hara Raju
|
Volume-3 Issue-2 - Improved Detection of DoS Attacks using Intelligent Computation
Techniques
IDSs play a principal role in pro-actively detecting intrusions into
enterprise-level computer networks, so the accuracy with which they perform this
vital function is of paramount importance. Many studies have previously been
conducted to improve the classification of detections using neural networks and
machine learning algorithms. We compare the performance of various intelligent
computation techniques, namely Bayesian networks, Naive Bayes, logistic regression,
RBF networks, multi-layer perceptrons, SVMs with the SMO model, k-nearest neighbour
and random forest, in detecting DoS attack patterns. The data used to train and
validate these techniques were obtained from the MIT Lincoln Lab study of IDSs. The
results obtained provide a clear comparison of the individual techniques' ability to
identify and classify attack patterns.
Keywords: Networks, intrusion detection, denial of service, datasets,
data mining, Bayesian networks, Naive Bayes, Logistic regression, RBF networks,
Multi-layer perceptron, Support vector machines, Sequential minimal optimization,
K-nearest neighbour
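The comparison methodology can be sketched with scikit-learn, using synthetic data as a stand-in for the MIT Lincoln Lab records; the model set below mirrors part of the paper's list:

```python
# Train several classifiers on the same labelled records and compare them by
# cross-validated accuracy. Synthetic data replaces the KDD-style dataset.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=12, random_state=0)

models = {
    "naive bayes":     GaussianNB(),
    "logistic reg":    LogisticRegression(max_iter=1000),
    "svm (smo-style)": SVC(),
    "k-nn":            KNeighborsClassifier(),
    "random forest":   RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:15s} accuracy = {acc:.3f}")
```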
|
J. Visumathi, Dr. K. L. Shunmuganathan
|
Volume-3 Issue-2 - Enterprise Application Integration (EAI) - A Changing Landscape
of an IT Organization
EAI is really a backlash to traditional distributed computing. Now that generations
of developers have built these automation islands of information and business processing,
more users and business managers are demanding seamless bridges between them. While
the business case is clear and easy to define, the task of implementing a solution
is not. There is a change in the landscape from building stand-alone applications
to building integrated applications that can take advantage of data and business
processes from legacy systems. This paper will try to provide the background necessary
to understand Enterprise Application Integration and the challenges customers face
in moving towards an integrated solution.
|
Dr. Sanjay H Buch
|
Volume-3 Issue-2 - Impact of FDI on Indian External Sector
An empirical assessment of the impact of foreign direct investment (FDI) on a host
country's external sector performance is important, since the external sector has
long been viewed as an engine of economic growth. FDI promotes exports and imports
by facilitating India's access to new and larger markets. The present study examines
and analyzes the impact of FDI on the Indian external sector between 1970-71 and
2008-09. Ordinary least squares (OLS) regressions and the empirical analysis are
conducted using annual data on FDI inflows, total exports, total imports and foreign
exchange reserves in India over the 1970-71 to 2008-09 period. There is sufficient
evidence of a significant relationship between the Indian external sector and FDI
inflows in India. FDI has a direct positive impact on exports, imports and foreign
exchange reserves: a 1% increase in FDI leads to a 4.1% increase in total exports
with no autocorrelation, a 45% increase in total imports and a 61% increase in
foreign exchange reserves in the presence of autocorrelation. It is also observed
that the impact of FDI on opportunities for domestic business and economic
activities is positive, as are the net attitudes of foreign firms toward FDI.
Keywords: FDI inflows, Export, Import, Foreign Exchange Reserve,
Regression Analysis
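A sketch of the OLS step with statsmodels, on synthetic data constructed around the reported export elasticity; it is not the paper's dataset:

```python
# OLS regression of (log) exports on (log) FDI inflows. The data are
# synthetic, generated around the 4.1 elasticity the abstract reports.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
log_fdi = np.linspace(0, 5, 39)                      # 39 annual observations
log_exports = 2.0 + 4.1 * log_fdi + rng.normal(0, 0.5, 39)

model = sm.OLS(log_exports, sm.add_constant(log_fdi)).fit()
print(model.params)          # intercept and elasticity estimate (~4.1)
print(f"R^2 = {model.rsquared:.3f}")
```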
|
Vijay Gondaliya
|