Announcements

  • Volume-5 & Issue-4 published (acceptance ratio 47.36%): http://www.ijmcs.info/current_issue
  • Volume-5 & Issue-3 published (acceptance ratio 38.88%): http://www.ijmcs.info/current_issue
  • Volume-5 & Issue-2 published (acceptance ratio 25%): http://www.ijmcs.info/current_issue
  • Submit your paper (last date 20th June 2017): http://www.ijmcs.info/submit_your_paper
  • Submit your paper free of cost: http://www.ijmcs.info/publication_fee...
  • Volume-5 & Issue-1 published (acceptance ratio 28.94%): http://www.ijmcs.info/current_issue
  • Volume-4 & Issue-5 published (acceptance ratio 41.42%): http://www.ijmcs.info/current_issue
  • Volume-4 & Issue-4 published (acceptance ratio 54.02%): http://www.ijmcs.info/current_issue
  • Volume-4 & Issue-3 published (acceptance ratio 48.05%): http://www.ijmcs.info/current_issue
  • Volume-4 & Issue-2 published (acceptance ratio 44%): http://www.ijmcs.info/current_issue
  • Volume-4 & Issue-1 published (acceptance ratio 50%): http://www.ijmcs.info/current_issue
  • Volume-3 & Issue-2 published (acceptance ratio 34%): http://www.ijmcs.info/current_issue
  • Submit your paper (last date 10th October 2015): http://www.ijmcs.info/submit_your_paper
  • Volume-2 & Issue-6 published (acceptance ratio 69.56%): http://www.ijmcs.info/current_issue
  • (October) Volume-2 & Issue-5 published (acceptance ratio 45.94%): http://www.ijmcs.info/current_issue
  • (15 August) Upcoming issue: Volume-2 & Issue-6: http://www.ijmcs.info/next_issue
  • Volume-2 & Issue-4 published (acceptance ratio 25%): http://www.ijmcs.info/current_issue
  • National Conference on Advances in Modern Computing and Application Trends, organised by Acharya Institute of Technology, Bangalore, India. For more details: http://www.acharya.ac.in/Techman_2014.pdf

    Volume-4 Issue-4 (August 2016)
    Title: Task Scheduling: The Technological Changes
    Authors: Gaurav Chaudhary and Mr. Manish Mahajan
    Abstract: Cloud computing is emerging as a paradigm for large-scale data-oriented applications. It builds on advances in distributed computing, grid computing, and virtualization. Because the cost of every job on cloud resources differs, scheduling jobs/tasks is not the same as in standard scheduling techniques. Job/task scheduling is a vital and critical problem of cloud computing: every scheduler has to satisfy cloud customers with the agreed QoS while maximizing the benefit of cloud providers. The era of cloud-based multimedia applications has prompted a tremendous increase in the number of requests on the cloud, and the increased number of requests leads to an increased workload, making workload balancing a vital QoS parameter. Workload balancing also promotes sensible utilization of resources such as electricity and therefore advances the idea of Green IT. Job/task scheduling in cloud computing is an NP-hard problem for which various methods and models have been developed and proposed, ranging from low-level to high-level execution of tasks on multiple processors. In this paper, the task scheduling techniques developed over the past years to meet essential QoS and changing customer needs are surveyed.
    Click Here to Download Pdf
    Title: A Data mining Classification for Heart disease diagnosis using First Order Logical Decision
    Authors: E. Santhiya and M. Praveena
    Abstract: The healthcare industry is generally “information rich”, but this information is impossible to handle manually. These huge amounts of data make data mining extremely important for retrieving useful information and finding associations among attributes. This paper aims to present a comparative study of current techniques for knowledge discovery in records using data mining, as applied in today's medical research, particularly in heart disease prediction. A number of experiments have been performed to evaluate the performance of predictive data mining techniques on heart disease data. The outcome reveals that the First Order Logical Decision (FOLD) tree performs best, achieving accuracy similar to that of a decision tree, while other predictive methods such as fuzzy rule classification, neural networks, and associative classification do not perform as well.
    Click Here to Download Pdf
    Title: Constructive Heuristic Load Balancing and lightweight dynamic channel allocation mechanisms for Cluster-based Mobile Ad Hoc Networks
    Authors: P.Bharathi and R.Maruthaveni
    Abstract: Efficient, dynamic allocation of wireless communication channels is significant for the performance of wireless mobile computing systems. An extensive model shows that both dynamic channel allocation and cooperative load matching improve bandwidth efficiency under heterogeneous load distributions in the IEEE 802.11 protocol. The aim is to achieve good scalability, long network lifetime, and low data-gathering latency. This paper attempts to provide inter-cluster broadcast: cluster head (CH) formation information is sent ahead to the base station for its moving-route planning. The proposed heuristic-based iterative dynamic sub-channel algorithm finds an efficient transmission path between nodes and is executed by the network in a centralized way. The proposed model uses a distributed approach of load-balanced clustering with a twofold base station, which is referred to as Heuristic Load Balancing (HLD).
    Click Here to Download Pdf
    Title: An Efficient Dynamic Privacy-Preserving Secure Active Link State Protocol Using Semi-Honest Model
    Authors: S. Divya and D. Kalaivani
    Abstract: In dynamic wireless networks, connection-path failures and malicious packet dropping are important causes of packet loss in multi-hop ad-hoc networks. Simply observing the packet drop rate against the channel-allocation defect rate does not reach acceptable reliability. To improve reliability, this paper proposes a novel privacy-preserving method to model the relationship between path failures and packet drops. To find these correlations, the proposed system extends the Active Link State Routing Protocol (ALSRP) with a semi-honest-model privacy-preserving protocol framework that allows mobile nodes to validate the reliability of reported packet (message) drop information. Through simulations, the system is verified to attain significantly better learning accuracy than existing techniques such as homomorphic-linear-authenticator-based detection.
    Click Here to Download Pdf
    Title: A Dynamic high dimensional data Clustering using subspace clustering Algorithm
    Authors: M. Kanchana and M. Mohanraj
    Abstract: Subspace clustering refers to the task of discovering a multi-subspace representation that best fits a group of points taken from a high-dimensional space. This paper presents subspace clustering algorithms that localize the search for relevant dimensions, permitting them to find clusters that exist in multiple, possibly overlapping subspaces. The paper's objective is to present a comparative evaluation of current techniques for clustering high-dimensional data. A number of tests have been performed to evaluate the performance of projective clustering techniques on high-dimensional data. The results reveal that the Fuzzy Sparse Subspace Clustering (FSSC) and Spectral Clustering (SC) algorithms achieve better accuracy, while other methods such as algebraic algorithms, iterative methods, and Agglomerative Lossy Compression (ALC) do not perform as well.
    Click Here to Download Pdf
    Title: Performance Evaluation of TCP alternatives in MANET using Reactive Routing Protocol
    Authors: Suneel Kumar Duvvuri and Dr. S. Rama Krishna
    Abstract: A mobile ad hoc network (MANET) is a network of autonomous wireless devices that can be set up immediately, anywhere and anytime, without needing network infrastructure. In a MANET, mobile devices are connected through wireless channels and move freely and randomly. Each mobile node often acts as a router for other nodes to establish paths between hosts. The Transmission Control Protocol (TCP) is implemented in the transport layer to facilitate reliable transmission of data. Because of the notable characteristics of a MANET, several modifications have been proposed for TCP to improve its performance. The reputed variants of TCP are TCP Tahoe, TCP Reno, TCP New Reno, SACK, FACK, and Vegas. MANETs utilize TCP and UDP for data transmission, and this paper focuses on the various congestion control and avoidance mechanisms that have been proposed for TCP/IP, specifically using AODV.
    Click Here to Download Pdf
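The congestion-window dynamics that distinguish these TCP variants can be illustrated with a small sketch. The following is a toy model (not from the paper) of Reno-style slow start plus additive-increase/multiplicative-decrease; the round count and the loss event are invented for illustration.

```python
def simulate_cwnd(rounds, loss_rounds, ssthresh=16):
    """Toy model of TCP slow start + AIMD congestion avoidance.

    cwnd doubles each round below ssthresh (slow start), grows by one
    segment per round above it (additive increase), and halves on a
    loss event (multiplicative decrease), as in TCP Reno.
    """
    cwnd = 1
    history = []
    for r in range(rounds):
        history.append(cwnd)
        if r in loss_rounds:                # packet loss detected
            ssthresh = max(cwnd // 2, 2)    # multiplicative decrease
            cwnd = ssthresh
        elif cwnd < ssthresh:
            cwnd *= 2                       # slow start: exponential growth
        else:
            cwnd += 1                       # congestion avoidance: linear growth
    return history

print(simulate_cwnd(10, {6}))  # → [1, 2, 4, 8, 16, 17, 18, 9, 10, 11]
```

The sawtooth in the output (growth, halving on loss, slower regrowth) is the behaviour that Tahoe, Reno, New Reno, SACK, and Vegas each refine in different ways.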
    Title: Image Thinning using Fuzzy Morphology
    Authors: Dillip Ranjan Nayak
    Abstract: Thinning is basically the reduction of a ‘thick’ digital object to a ‘thin’ skeleton. It is one of the most frequently used methods for finding the geometrical structure of objects. In this paper we propose a method for thinning binary images using the morphological Hit-or-Miss Transform (HMT) and a Gaussian fuzzy function. The Hit-or-Miss transform is a general binary morphological operation that can be used to search for particular patterns of foreground and background pixels in an image.
    Click Here to Download Pdf
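As a rough sketch of the Hit-or-Miss step the abstract describes (not the authors' fuzzy variant), the following pure-Python code applies one illustrative 3x3 structuring-element pair and deletes the matched pixels; a full thinning pass would iterate this over the eight rotated element pairs until the image stops changing.

```python
def hit_or_miss(image, hit, miss):
    """Binary Hit-or-Miss Transform on a 2-D 0/1 image.

    A pixel fires when every 1 in `hit` sits over foreground and every
    1 in `miss` sits over background in its 3x3 neighbourhood (pixels
    outside the image count as background)."""
    rows, cols = len(image), len(image[0])
    def px(r, c):
        return image[r][c] if 0 <= r < rows and 0 <= c < cols else 0
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            match = True
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    v = px(r + dr, c + dc)
                    if hit[dr + 1][dc + 1] and v != 1:
                        match = False
                    if miss[dr + 1][dc + 1] and v != 0:
                        match = False
            out[r][c] = 1 if match else 0
    return out

def thin_once(image, hit, miss):
    """One thinning step: remove the pixels where the HMT fires."""
    hm = hit_or_miss(image, hit, miss)
    return [[1 if image[r][c] and not hm[r][c] else 0
             for c in range(len(image[0]))] for r in range(len(image))]

image = [[1, 1, 1, 1],
         [1, 1, 1, 1],
         [1, 1, 1, 1]]
hit  = [[0, 0, 0], [0, 1, 0], [0, 1, 0]]   # foreground: centre and pixel below
miss = [[0, 1, 0], [0, 0, 0], [0, 0, 0]]   # background: pixel above
print(thin_once(image, hit, miss))          # top row of the blob is peeled off
```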
    Title: Review on Data Deduplication and Secured Auditing of Data on Cloud
    Authors: Lakshmi Hunashikatti and Prof. P.M.Pujar
    Abstract: With the unlimited advancement of cloud computing over recent years, storage of information in the cloud has become an attractive trend, as a client can easily store and maintain information there. The aim is to propose two secure frameworks, SecCloud and SecCloud+, which achieve both de-duplication and secure auditing of data in the cloud. The SecCloud framework involves an auditor that helps partition the record into blocks; every block is assigned a digital signature and is checked for integrity before being uploaded to the cloud. The design of SecCloud+ enables secure de-duplication and data auditing on encrypted data, as customers routinely need to encrypt their data before uploading to preserve file confidentiality.
    Click Here to Download Pdf
    Title: Cryptography Algorithms for Secure Communication: A Survey
    Authors: Ms. Mayuri Wagh
    Abstract: Security is an essential and primary concept for assuring the performance and reliability of execution with a trust factor. It provides an environment isolated from the public network during communication. Security features guarantee the safety of information and communication against unauthorized access and usage. Security can be stated as the tool or mechanism for maintaining the privacy of information and protecting it from hazardous situations. It is a common requirement that needs to be integrated with every type of application. To obtain better integration of security features, six different security principles have been defined; confidentiality, authentication, and integrity are the major principles and must be integrated with any application. This paper consists of a basic overview of the various security principles, the need for them, and existing security mechanisms. It also investigates the problems in existing systems and advises how to rectify them with advanced solutions.
    Click Here to Download Pdf
    Title: Security Attacks in Mobile Ad Hoc Networks: A brief overview
    Authors: A. Gowsi Monica and A. Indhumathi
    Abstract: Security is a necessary constraint in wireless ad hoc networks. Compared to wired networks, wireless ad hoc networks are more susceptible to security attacks due to the lack of a trusted centralized authority and limited resources. Security is one of the major issues wireless ad hoc networks face today. Security attacks can be categorized as passive or active, depending on whether the typical operation of the network is interrupted. The main objective of this paper is to present the different kinds of security attacks, their effects, and defence methods in wireless ad hoc networks, which are susceptible to attacks and threats due to their characteristics and limitations. The paper assists researchers in forming a strong picture of the security issues and active attacks, and these facts and ideas can be used to build more secure wireless network systems in future. It can also motivate the development of new security methods to protect against new possible attacks along with existing ones.
    Click Here to Download Pdf
    Title: Self Adaptive Routing with Smart Bandwidth Utilization for WSANs
    Authors: Gurpreet Singh and Dr. Bikrampal Kaur
    Abstract: Provisioning network survivability is crucial in wireless sensor and actor networks (WSANs) because nodes deployed in hostile environments are vulnerable to common failures. Failure of an actor appreciably affects actor-linked coverage, which is vital for effective network operation. Existing mobility-based recovery schemes are geared towards restoring either inter-actor connectivity or network coverage; none of them considers sustaining actor coverage (i.e., having sensors reachable to actors) while restoring inter-actor connectivity. This paper presents a balanced topology to manage lifetime and load over a WSAN. The simulation is aimed at evaluating the performance of wireless sensor networks (WSNs) and wireless sensor and actor networks (WSANs) in the proposed setting. The proposed model is evaluated for execution time, energy consumption, overhead, load, delay, and many other related parameters. The WSN and WSAN are implemented in a balanced topology with an adequate number of sensor nodes, and the sensor nodes are evaluated under various scenarios.
    Click Here to Download Pdf
    Title: Digital Image Watermarking from Past to Future: An Overview
    Authors: M.Mangaiyarkarasi
    Abstract: Digital image watermarking is extensively used for copyright protection, and was developed to assist authentication of data and security of digital information. The robustness of embedded watermarks against diverse attacks is the main advantage of watermarking, which is why robustness is preferred in watermarking algorithms: with a robust algorithm it is impossible to eliminate the watermark without severe degradation of the main content. This paper presents an analysis of different watermarking techniques and their properties. The main motivation for progress in digital watermarking research is to preserve intellectual property in the digital world, because recent technological developments make it easy to copy digital content without limitation and to edit it without excessive professional effort. Lacking protection techniques, it is difficult to apply communication systems to secure business, military, and medical applications. The main objective of this paper is to present a view of different digital watermarking properties, techniques, applications, and attacks.
    Click Here to Download Pdf
    Title: Geographical Path for Multi-Radio Networks Using MANETS
    Authors: Mr. Anand M
    Abstract: MANETs equipped with multiple radios are considered the basis of the future force communication networks envisioned for use in battlefield applications. In this paper, we propose a new geographical path protocol for such multi-radio, multi-band tactical MANETs. The protocol is designed to use multiple routing metrics with several radio interfaces, such as IF0, IF1, and IF2, each of which runs on a different range of frequency bands. The protocol gives higher quality of service than the existing GPRS technology. The protocol is simulated using the JProwler platform, with Java (Swing) code, the NetBeans IDE, and Windows OS.
    Click Here to Download Pdf
    Title: Survey of Vehicular Ad-hoc Networks (VANETs) With Security Requirements
    Authors: A.Prabhaharan and K.Vijayakumar
    Abstract: Vehicular ad-hoc networks (VANETs) are the most prominent enabling network technology for intelligent transportation systems. The aims of VANETs range from improved road safety, time-critical safety applications, and optimized traffic flow to delay tolerance, infotainment, and so on. High relative node speed and high active node density pose typical challenges for connectivity within a VANET. There is a lot of research from industry, academia, and standards agencies to develop standards and prototypes for vehicular networks. This paper provides a summary of the recent state of the art of VANETs; it presents the communication architecture of VANETs and outlines the privacy and security challenges that need to be overcome to make such networks safely usable in practice. A challenge facing ad hoc networks is that the topology of the network changes rapidly: vehicles in a VANET have a high degree of mobility, and the average length of time that two vehicles are in direct communication range with each other is approximately one minute.
    Click Here to Download Pdf
    Title: Towards the Next Generation: Apache Spark Replaces the Challenges of Apache Hadoop in Big Data
    Authors: R.Sandrilla
    Abstract: Today the Web has progressed and a huge amount of data has evolved. The World Wide Web has grown tremendously as the leading source of information and has made the world see unprecedented data growth in recent years. This development of the WWW made Big Data a reality, and Big Data has become a fascinating topic for machine learning supporters around the world; handling it has become a great challenge. This paper discusses the next generation of tools addressing the challenges of Big Data. MapReduce acts as a key for facing the unceasing demands imposed by substantial data sets, but the development of Apache Spark has superseded Hadoop in meeting those continuous demands. Therefore, a complete comparison is made between Apache Hadoop and the more recent Apache Spark. Even though they agree in processing the concept of Big Data, they differ in a few respects; the usefulness and effectiveness of both are comparatively differentiated and described. This paper gives a glimpse of the complete workings of MapReduce and Apache Spark, their challenges in applications, and the recent surge of interest in Apache Spark. It concludes that Apache Spark is a promising technology that not only supports Hadoop enormously but is also an alternative: highly performing, scalable, and a magnificent way of providing ease of programming to data science.
    Click Here to Download Pdf
    Title: MongoDB: An Innovative Database Approach
    Authors: Gunjan Soni and Twinkle Soni
    Abstract: Taking emphasis on the challenging demand of storing and maintaining large amounts of data, we present a highly secure and efficient database called MongoDB. This paper presents a new approach to storing and managing huge amounts of data. Nowadays, due to agile development approaches in which the database changes frequently, NoSQL databases have surged in popularity; they can give better performance than SQL databases. In this paper we aim to highlight the advantages of a non-SQL database such as MongoDB over SQL databases. We describe the structure of a document-oriented database, which has no predefined schema, a feature MongoDB supports. To store more sensitive information, MongoDB provides a very high level of security. We also compare operations on the key-value stores implemented by a NoSQL database, namely MongoDB, with SQL databases such as relational DBs (e.g., MySQL).
    Click Here to Download Pdf
    Title: Cloud Security: DDoS Defense Mechanisms
    Authors: Sandipan Basu and Sunirmal Khatua
    Abstract: Security has been a major issue in the field of computer networking, and it remains a major issue in the field of cloud computing too. This paper focuses on security issues related to cloud systems. The types and number of threats have been increasing as malicious users/attackers try to access cloud services/resources in an unauthorized manner. Moreover, they are not limited to this; their malicious activities also prevent authorized cloud users from accessing the resources/services they deserve. In this respect, one of the most interesting and challenging security issues is the DDoS (Distributed Denial of Service) attack, which is even more fatal in the case of a cloud system. In this paper, we examine some of the popular DDoS attacks and propose defense mechanisms that will increase the security level of a network.
    Click Here to Download Pdf
    Title: A Literature Review on High Dimensional Data Classification Techniques in Data Mining
    Authors: Ms.N.Gayathri and Ms.K.Yemuna Rane
    Abstract: Data mining is a process of gathering knowledge from massive data. In data mining, classification is a (machine learning) procedure used to predict the group membership of data instances: it finds a model that divides data into classes according to certain conditions. This survey paper describes high-dimensional data classification in response to such needs and presents an evaluation of conventional classification techniques used in data mining. The major focus is on classification techniques such as instance-based learning, feature selection, Features Annealed Independence Rules (FAIR) classification, and support vector machine classification, which are used to mine databases. The objective of this survey is to provide a complete review of the different data classification techniques in data mining.
    Click Here to Download Pdf
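Instance-based learning, the first technique the survey names, keeps the training data itself as the model. A minimal sketch is a 1-nearest-neighbour classifier; the data points and labels below are invented for illustration.

```python
import math

def nn_classify(train, query):
    """1-nearest-neighbour: label the query with the label of the
    closest training point under Euclidean distance. No model is
    built; the stored instances *are* the classifier."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    features, label = min(train, key=lambda item: dist(item[0], query))
    return label

# Hypothetical two-class data: (feature vector, label) pairs.
train = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"),
         ((8.0, 9.0), "high"), ((9.0, 8.5), "high")]
print(nn_classify(train, (1.1, 0.9)))  # → "low"
```

High-dimensional data is exactly where this simple scheme struggles (distances concentrate), which is why the survey pairs it with feature selection and FAIR-style dimension reduction.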
    Title: Structural Health Monitoring Using Image Processing Techniques- A Review
    Authors: Dr. Arabinda Sharma and Neeraj Mehta
    Abstract: Automated health monitoring and maintenance of civil infrastructure systems is an active yet challenging area of research. Identifying appropriate applications of technology for assessing the health and safety of structures is an important issue around the world. For example, current infrastructure inspection standards require an inspector to travel to a target structure site and visually assess the structure’s condition. A less time-consuming and inexpensive alternative to current monitoring methods is to use a robotic system that could inspect structures more frequently and perform autonomous damage detection. Among several possible techniques is the use of optical instrumentation (e.g., digital cameras), and the feasibility of using image processing techniques to detect deterioration in structures has been widely investigated by many researchers in the field. Traditionally, structure conditions have been monitored through visual inspection, with structural deficiencies being manually identified and classified by qualified engineers and inspectors; more advanced and objective bridge inspections are required for monitoring long-term bridge performance. This paper deals with inspection methods using high-resolution digital images. Damage assessment proceeds in stages of increasing difficulty, each requiring knowledge of the previous stages, namely: 1) detecting the existence of damage on the infrastructure, 2) locating the damage, 3) identifying the type of damage, and 4) quantifying the severity of the damage.
    Click Here to Download Pdf
    Title: A Hybrid Optimization approach to Eliminate Duplicate File Using Hashing Technique in Cloud Storage
    Authors: Rohini Sharma & Miss Amritpal Kaur
    Abstract: Clouds are large pools of easy-to-use and easy-to-access virtualized resources. Cloud computing offers on-demand, self-service, speedy, adaptable, and universal access to several computing properties and resources. Data de-duplication is a data-compression technique that reduces storage requirements by eliminating duplicate copies of information, and reduces the amount of information that has to be transferred over a network. In this paper we propose a hybrid optimization approach that combines the SHA-3 algorithm, to generate hash values in the form of strings; the k-means clustering algorithm, to partition the data according to the hash values; and a genetic algorithm, to search the hash values for the best solutions for unique data. It is an interactive program that provides numerical computation and visualization of data. The parameters on which the comparison is done are memory consumption, hashing time, detection time, and detection accuracy.
    Click Here to Download Pdf
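The SHA-3 hashing core of such a scheme can be sketched briefly: each block is keyed by its SHA3-256 digest, so identical blocks collapse to one stored copy. This is only the hashing/de-duplication step; the paper's k-means and genetic-algorithm stages are not reproduced here.

```python
import hashlib

def dedupe(blocks):
    """Content-addressed de-duplication: keep one copy per distinct
    SHA3-256 digest, and record every block's digest in input order
    (the digest is the storage key)."""
    store = {}   # digest -> single stored copy
    index = []   # per-block digest, in input order
    for block in blocks:
        digest = hashlib.sha3_256(block).hexdigest()
        store.setdefault(digest, block)   # first copy wins; repeats are skipped
        index.append(digest)
    return store, index

store, index = dedupe([b"alpha", b"beta", b"alpha"])
print(len(store))  # → 2 (two unique blocks survive out of three)
```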
    Title: Data Mining in Agriculture- A Survey
    Authors: N.Neelaveni
    Abstract: Data mining is used to fetch needed information from large databases. Nowadays, data mining concepts and techniques are used to resolve agricultural problems. Here we discuss how data mining techniques are applied in the agriculture field.
    Click Here to Download Pdf
    Title: Review Paper on Load - Balancing for fault tolerance system in Cloud computing architecture
    Authors: Navjot Kaur and Bharti Chhabra
    Abstract: Cloud computing is the term used for a model that provides convenient, on-demand network access to a shared pool of computing resources, applications, storage, and services, stored at a remote location, that can be rapidly released with minimum interaction in a well-organized way. A typical cloud consists of several elements such as clients (nodes), virtual machines, and virtual servers. Virtual machines and virtual servers are used to distribute the load among the nodes, enhance network throughput, and reduce execution time and battery consumption. When the load is not equally distributed among the nodes, the chance of errors occurring increases, so a fault-tolerance approach is required to reduce the error rate in a mobile distributed network. A task allocation model is used to allocate tasks to the various mobile nodes; in this model, tasks are allocated among processors on the basis of the capacities of the processors and communication links. The issue of failure can be resolved by task redundancy, provided by a backup system attached to each node of the remote cloud distributed system. This paper focuses on designing algorithms for making the system fault tolerant, balancing the load among the nodes, and helping recover from faults in the minimum amount of time.
    Click Here to Download Pdf
    Title: Weather Forecasting – A Survey
    Authors: P. Kalaiselvi
    Abstract: Weather is one of the most influential environmental constraints on our lives, and weather prediction has become one of the most challenging problems in the world. Weather warnings are important forecasts because they are used to protect life and property. In this paper, we examine the use of data mining methods and techniques in forecasting maximum temperature, rainfall, evaporation, and wind speed. A wide variety of weather forecast methods are available, and the work completed by several researchers in this field is reviewed. The results show that, given enough case data, data mining methods, forecast types, and techniques can be used for weather forecasting and climate change studies.
    Click Here to Download Pdf
    Title: A Load Analysis Paradigm using Content Based Model in Cloud Environment
    Authors: Chandini M.S, Mrs. Pushpalatha R. and Dr. Ramesh Boraiah
    Abstract: The tremendous increase in the arrival rate of live content poses a great challenge to publish/subscribe systems regarding the distribution of large-scale real-time content to the appropriate users in a reliable and scalable manner. The CBM model is used for data distribution because of its ability to scale the system to a huge size. Existing pub/sub systems that provide event-matching services exhibit either low matching throughput when matching large numbers of subscriptions, or distribution interrupted for seconds when a large number of servers stop service all together. In this paper, a scalable and reliable event-matching service called CBM is proposed for content-based pub/sub systems in a cloud computing environment. In CBM, a distributed overlay, SkipCloud, is proposed to organise servers, achieving low routing latency and reliable links among servers. Using a hybrid space-partitioning technique, HPartition, large-scale skewed subscriptions are mapped into multiple sub-spaces, which guarantees high matching throughput and provides multiple data centers for every event. The performance of CBM is evaluated by deploying 64 servers and subjecting a large number of live-content objects to experiments on a CloudStack testbed. The experimental results show the event-routing load in SkipCloud to be 60 percent less than a Chord-based overlay, and the matching rate of CBM to be 3.7 and 40.4 times that of BlueDove. Also, CBM keeps the query loss rate near zero within tens of seconds even when a large number of servers stop service simultaneously.
    Click Here to Download Pdf
    Title: Overview of different text similarity methods in Research Document
    Authors: R.Neethu and S.Vydehi
    Abstract: Plagiarism of digital documents is rapidly increasing, and since manual detection is difficult, an automatic system is needed to detect the similarity between two documents. A number of algorithms are used to check for plagiarism. Measuring similarity is applicable in areas such as document clustering, information retrieval, and plagiarism checking; text categorization must be efficient, and space and time complexity must be reduced. There are mainly two types of plagiarism detection: intrinsic and external. Here we discuss the intrinsic plagiarism techniques. For measuring document similarity, the Ferret algorithm is used on the MapReduce and Hadoop framework; in the genetic-algorithm approach, a similarity coefficient represents the similarity between a pair of documents. Many researchers have coined their ideas in different studies, and in this paper I review some of these methods and discuss how the algorithms work.
    Click Here to Download Pdf
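The similarity coefficient mentioned above can be sketched in a few lines. Ferret-style detectors are commonly described as comparing word-trigram sets with a Jaccard-style resemblance score; the tokenisation below (lowercase, whitespace split) is a simplifying assumption, not the reviewed systems' exact preprocessing.

```python
def trigrams(text):
    """Set of consecutive word triples in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def resemblance(doc_a, doc_b):
    """Jaccard coefficient over word trigrams: shared trigrams
    divided by all distinct trigrams in either document."""
    a, b = trigrams(doc_a), trigrams(doc_b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

print(resemblance("the cat sat on the mat",
                  "the cat sat on the mat"))  # → 1.0 for identical documents
```

A score near 1 flags likely copying, near 0 independent texts; clustering and retrieval can reuse the same coefficient as a distance.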
    Title: An Optimized hashing based De-Duplication process for utilize Memory Consumption over a Cloud Storage
    Authors: Manpreet Kaur and Daljit Kaur
    Abstract: The basic goal of data de-duplication is finding duplication within data without affecting its integrity; its main objective is to identify duplicate data and store only a single copy of each instance. Although various methods have been used, most have shortcomings such as high memory consumption, long running time, and low accuracy. So, in this paper we use a combination of algorithms: the Message Digest algorithm (MD5) to provide more security and to check the integrity of files, and the Secure Hash Algorithm (SHA-3) to reduce the memory requirement and time consumption of data de-duplication. By using these algorithms together, one should obtain better performance and significant results.
    Click Here to Download Pdf
    Title: A Reliable Vertical Handover Approach in Wireless Sensor Network using Re-association Procedures
    Authors: Sapna Saini, Naveen Sharma and Tanisha Saini
    Abstract: Wireless sensor networks have always enraptured researchers, who expand the field by estimating and perceiving its potential in various application areas; one of the main research challenges is node mobility. Node mobility basically concerns network behavior and the loss of mobile nodes. In this paper, focusing on node mobility, nodes are made able to self-organize and auto-configure, which minimizes data packet loss and enables retransmission by keeping a check on message size, so that the nodes can ensure efficient performance in terms of energy consumption, reliability, and throughput. This is based on a re-association procedure for a reliable vertical handover process performed according to adequate association criteria. The proposed mechanism will be implemented in NS-2, and various performance parameters of the proposed system will be computed.
    Click Here to Download Pdf
    Title: Clustering Based Energy Efficient Routing Protocol in Wireless Sensor Network: A Survey
    Authors: Sweety and Vinod Saroha
    Abstract: Studies over the past few years show that the use of WSNs has increased in applications such as military operations, combat-field reconnaissance, disaster management, and security surveillance. In most of these applications a large number of sensor motes is deployed remotely over a vast area, each operating independently in an unattended environment. Alongside these applications, however, the power supply of nodes is a major issue: sensor nodes are battery-powered, and replacing them is infeasible. The lifetime of a WSN depends strongly on the lives of all its sensor nodes. As nodes sense data from the environment and forward it to the sink, they consume energy in computing and forwarding data or messages, so this forwarding must be done efficiently to minimize energy use. Much research has sought routing protocols that route data efficiently by optimizing energy consumption. In this paper we survey and discuss various routing schemes that aim to increase network lifetime.
    Click Here to Download Pdf
    Title: A Resource Aware Load Balancing Model in Cloud Computing with Multi-Objective Scheduling
    Authors: Kavita and Vikas Zandu
    Abstract: Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the Internet. Usually a number of jobs must be executed with the available resources to achieve optimal performance: the least possible completion time, the shortest response time, efficient utilization of resources, and so on. Hence job scheduling is the most important concern, aiming to ensure that users' requirements are properly and correctly satisfied by the cloud infrastructure. Most task-scheduling algorithms developed for cloud computing target a single criterion, which fails to provide efficient resource utilization; to enhance system performance and increase resource utilization, multiple criteria must be considered. This paper proposes a multi-objective scheduling algorithm that considers a wide variety of attributes in the cloud environment. It aims to improve performance by reducing the load on a virtual machine (VM) through a load-balancing method, and it optimizes resource utilization with a resource-aware scheduling algorithm that combines the Min-Min and Max-Min strategies.
    Click Here to Download Pdf
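    The Max-Min strategy the abstract combines can be sketched under a simplifying assumption of homogeneous VMs, where Max-Min reduces to a largest-task-first greedy assignment to the earliest-finishing machine; the function name and task format below are our own, not the paper's.

```python
def max_min_schedule(tasks, machines):
    """Max-Min heuristic sketch (homogeneous VMs): take the largest
    remaining task and place it on the machine where it finishes earliest.
    tasks maps task name -> execution time; returns (assignment, makespan)."""
    load = [0.0] * machines                 # current finish time of each VM
    assignment = {}
    for t in sorted(tasks, key=tasks.get, reverse=True):   # largest task first
        m = min(range(machines), key=lambda i: load[i] + tasks[t])
        load[m] += tasks[t]
        assignment[t] = m
    return assignment, max(load)            # makespan = latest-finishing VM

plan, makespan = max_min_schedule({"t1": 4, "t2": 3, "t3": 2, "t4": 2}, 2)
print(makespan)   # 6
```

    Scheduling large tasks first leaves the small ones to even out the loads at the end, which is the intuition behind preferring Max-Min when task sizes vary widely.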
    Title: A Data Classification Model for achieving Data Confidentiality in Cloud Computing
    Authors: Kulwinder Kaur and Vikas Zandu
    Abstract: Cloud computing offers numerous benefits, including scalability, availability, and many services, on an on-demand, pay-per-use basis. It must address three main security issues: confidentiality, integrity, and availability. As this new technology expands, it also exposes new risks and vulnerabilities, and security of data in cloud computing is still a goal to be achieved. Data owners must place their confidential information on public cloud servers that are not within their trusted domains, so security and privacy of data are the major concerns in cloud computing. To address this, various security issues are analyzed and a framework is proposed to mitigate them at the authentication and storage levels. Choosing a data-security approach without understanding the security needs of the data is not a valid technical approach: before applying any security to data in the cloud, it is best to know whether that data needs protection at all. This paper therefore proposes a data-classification approach based on data confidentiality, whose aim is to classify data by its security needs into two classes, confidential and non-confidential. The proposed approach uses a machine-learning algorithm to identify the part of the dataset that must be encrypted and the part that need not be, saving the time and cost of privacy preservation. Efficient security mechanisms, by means of encryption, authentication, authorization, or other methods, can then be deployed to ensure the privacy of consumers' data in cloud storage.
    Click Here to Download Pdf
    Title: Business Intelligence in Social Network-Survey
    Authors: P.Lissy and P.Sudha
    Abstract: In recent years, data-mining research has led to the development of numerous efficient and scalable methods for mining interesting patterns in large databases. Data mining is making steady progress in social networks, which are accessible platforms where people all around the net communicate and share and exchange information, ideas, and videos. This paper provides an overall understanding of business intelligence through social-network analysis using data-mining techniques, and reviews the basic data-mining tools and techniques used to support better decision making in business intelligence (BI).
    Click Here to Download Pdf
    Title: Image Splice Detection using Discrete Cosine Transform Features
    Authors: Chitwan Bhalla and Surbhi Gupta
    Abstract: In the 21st century, the security of any digital image is threatened by the many editing tools that can alter image contents without leaving any visible trace of the changes made, enabling image forgery. Forgery detection includes two main approaches, active and passive. The active approach covers watermarking and signature techniques: in watermarking, a security structure is embedded into the image as a watermark, and in signature detection, a signature is embedded into the image. The passive approach covers three forgery types. First is copy-move forgery, in which a part of the image is copied and pasted into another part of the same image. Second is retouching, which changes the image as a whole. Third is image splicing, a paste-up process produced by sticking together photographic images. In this paper an enhanced feature-extraction technique is proposed for splice detection. The image is passed to a Local Binary Pattern (LBP) operator, where an LBP code is generated by binarization; the code is then passed through the two-dimensional discrete cosine transform (2D-DCT), after which features are extracted. The efficiency of the algorithm is evaluated with five support vector machine (SVM) kernels: linear, polynomial, quadratic, radial basis function (RBF), and multi-layer perceptron. With the RBF kernel, the accuracy obtained is 98.7%. Numeric values for recall and precision are also reported.
    Click Here to Download Pdf
    Title: A Review on Multi-Label Learning Techniques
    Authors: Palash V Karmore and A.M.Bagade
    Abstract: In a single-label learning problem, labels are considered mutually exclusive. Multi-label learning problems differ in that their labels overlap, so a single instance can belong to multiple classes. Examples of such problems include web page classification, scene classification, medical diagnosis, and image labeling. This review paper first introduces multi-label learning, then discusses various algorithms and techniques available for multi-label learning applied to web page classification, and finally compares these algorithms on various parameters.
    Click Here to Download Pdf
    Title: Secure Switching Based Differential Evolution for Cluster Head Election in Wireless Sensor Network
    Authors: Sweety and Vinod Saroha
    Abstract: Wireless sensor networks have gained substantial interest among researchers because of their wide range of applications in harsh and unattended environments. Efficient consumption of energy is the prime design issue for these networks. Clustering is an efficient method that balances the load of the sensor nodes and improves the overall scalability and lifetime of wireless sensor networks (WSNs). In a cluster-based WSN, the cluster heads (CHs) dissipate more energy because of the additional workload of receiving sensed data from member nodes, aggregating it, and transmitting the aggregated data to the base station (BS). For load balancing, clustering algorithms rotate the cluster-head responsibility among the member nodes, so the cluster-head selection process is crucial: the performance of clustering is greatly influenced by the choice of CHs, and finding the optimal set of CHs for a WSN is known to be an NP-hard problem. Evolutionary computation approaches can find fast and efficient solutions to such problems, and differential evolution (DE) has proven to be a robust evolutionary algorithm for global numerical optimization. In this paper we propose a Switching with Differential Evolution (S-DE) based clustering algorithm for WSNs to improve network lifetime. Simulation shows that the proposed solution finds optimal cluster heads and sustains a longer network lifetime than traditional clustering algorithms. The experimental results exhibit the efficiency of the proposed algorithm, and its performance is compared with a genetic-algorithm-based clustering algorithm.
    Click Here to Download Pdf
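    The DE machinery the abstract relies on (DE/rand/1/bin: a scaled difference-vector mutation, binomial crossover, greedy selection) can be sketched as a generic minimizer; this is our own illustrative sketch on a toy objective, not the paper's CH-election fitness, and all names are hypothetical.

```python
import random

def de_minimize(f, dim, pop_size=20, F=0.5, CR=0.9, gens=100, seed=1):
    """DE/rand/1/bin sketch: for each target vector, build a mutant from
    three distinct others, cross it over, and keep whichever is fitter."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)          # guarantees one mutated dimension
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:                    # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sol, val = de_minimize(lambda x: sum(v * v for v in x), dim=2)
print(val)   # close to 0 for the 2-D sphere function
```

    In a CH-election setting, `f` would instead score a candidate set of cluster heads, e.g. by residual energy and intra-cluster distances.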
    Title: EDEEC-Enhanced Distributed Energy Efficient Clustering Protocol for Heterogeneous Wireless Sensor Network (WSN)
    Authors: Deepti Goyal and Sonal
    Abstract: Wireless Sensor Networks (WSNs) consist of a widespread random deployment of energy-constrained sensor nodes. Sensor nodes differ in their ability to sense data and send it to the Base Station (BS), or sink, and both sensing and transmitting data towards the sink require a large amount of energy. Conserving energy and prolonging network lifetime are therefore great challenges in WSNs, and many routing protocols have been proposed to achieve energy efficiency in heterogeneous environments. This paper focuses on a clustering-based routing technique, the Enhanced Distributed Energy Efficient Clustering scheme (EDEEC). EDEEC uses three types of nodes to extend the lifetime and stability of the network, thereby increasing its heterogeneity and energy level. Simulation results show that EDEEC performs better than DDEEC and DEEC.
    Click Here to Download Pdf
    Title: Leukemia Detection in the White Blood Cell Count using Sift Technique and Classification
    Authors: Jasleen Kaur and Miss Amrinder Kaur
    Abstract: An automated approach to leukemia detection is proposed; in the manual technique, a specialist inspects microscopic images. Leukemia arises in the bone marrow and is detected chiefly by examining the white blood cells. Focusing only on WBCs, the leukemia-detection system analyses the microscopic image and overcomes the problems of manual inspection: it extracts the necessary parts of the image and applies its techniques directly, with K-means clustering used for WBC detection alone. This thesis describes a system for medical image processing that mainly uses SIFT (scale-invariant feature transform) to identify the best features, with classification by a feed-forward neural network trained with back-propagation. Noise is removed from the image with a filter and the white blood cells are identified; feature extraction with SIFT then converts the image into data, and the classifier labels a test image as infected or not infected. The major part of this work is segmenting the white blood cells for leukemia detection. Performance parameters such as false acceptance rate, false rejection rate, and accuracy are evaluated, and the manual and automatic white-blood-cell counts are compared.
    Click Here to Download Pdf
    Title: Survey on Clustering Techniques in Data Mining
    Authors: V.Maheswari
    Abstract: The process of data mining extracts information from a large data set, and clustering is its main exploratory task. Clustering is the effort of grouping a set of objects. There are different types of clusters, such as well-separated and density-based clusters, and methods such as hierarchical, partitioning, grid-based, and density-based algorithms support the formation of clusters. This paper reviews the various types of clusters, techniques, and algorithms.
    Click Here to Download Pdf
    Title: A Review On Data Mining Techniques To Classify Soil Data
    Authors: R.Preetha and Prof Mr. B. Murugesakumar
    Abstract: The Indian economy depends on agriculture, which is the main source of income for most of the population. Soil fertility is important to food production in agriculture: the productive capacity of soil depends on its fertility, and achieving and maintaining appropriate fertility levels is of utmost importance if agricultural land is to remain capable of sustaining crop production. Data mining plays a vital role in decision making on several issues related to agriculture, and here data-mining techniques are used to analyse soil fertility. Soils are classified with systems such as the USDA soil taxonomy, the FAO/UNESCO system, and the INRA (French) system. The use of information technology in agriculture can change how decisions are made and help farmers achieve better yields. The focus is on classification methods, using various algorithms to analyse the soil data set; this paper provides a survey of various soil-classification techniques.
    Click Here to Download Pdf
    Title: Survey on Data mining Techniques used in Kidney related Diseases
    Authors: N. Pavithra and Dr.R. Shanmugavadivu
    Abstract: Kidney disease is nowadays a major problem in medical data mining. The healthcare industry mainly uses data-mining techniques to predict kidney disease from data sets, so this paper surveys the different DM techniques used and the accuracy they achieve on kidney-related diseases.
    Click Here to Download Pdf
    Title: Comparative Analysis of DCT and DWT techniques
    Authors: Rupam and Sudesh Nanda
    Abstract: In recent years we have been witnessing a transformation in communication technologies, and image compression is one of the enabling technologies of this multimedia revolution: it eases communication by representing the data in compact form. Among the different techniques used for image compression, here we implement DCT and DWT. A comparative analysis of image compression using DCT and DWT is performed with the help of performance criteria such as compression ratio, MSE, and PSNR.
    Click Here to Download Pdf
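    The MSE and PSNR criteria the comparison rests on have standard closed forms; a minimal sketch over flat pixel lists (the sample values below are invented for illustration):

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; higher means the reconstructed
    image is closer to the original (infinite for identical images)."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)

original   = [52, 55, 61, 66]
compressed = [54, 55, 60, 66]
print(mse(original, compressed))             # 1.25
print(round(psnr(original, compressed), 1))  # 47.2
```

    Compression ratio, the third criterion, is simply the original size divided by the compressed size, so the trade-off is ratio versus PSNR.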
    Title: Automated Count of Leukemia Cancer cells of Microscopic images with ACO algorithm and BPNN classifier
    Authors: Jeenu Garg and Ms. Daljit Kaur
    Abstract: An automated approach to leukemia detection is proposed; in the manual technique, a specialist inspects microscopic images. Leukemia arises in the bone marrow, the thin material inside each bone that produces erythrocytes, leucocytes, and platelets, and it is detected chiefly by examining the white blood cells. Focusing only on WBCs, the leukemia-detection system analyses the microscopic image and overcomes the problems of manual inspection: it extracts the necessary parts of the image and applies its techniques directly, with K-means clustering used for WBC detection alone. This paper describes a system for medical image processing that mainly uses SIFT (scale-invariant feature transform), with classification by a back-propagation neural network (BPNN), and reports the results achieved on different data sets. Blood images of good pixel quality are obtained, noise is removed with a filter, and the white blood cells are identified. Solidity is measured to obtain a clear image, and roundness checks whether a shape is circular, in order to extract features from the processed picture, chiefly features of the nuclei of myelocytes and lymphocytes. Feature extraction using PCA converts the image into data, and the BPNN classifier labels a test image as infected or not infected. The ALL-IDB database proposed by Donida Labati is used. The major part of this work is segmenting the white blood cells for leukemia detection; performance parameters such as false acceptance rate, false rejection rate, and accuracy are evaluated, and the manual and automatic white-blood-cell counts are compared.
    Click Here to Download Pdf
    Title: Multi-keyword Context-Oriented diversification search on XML Data using Map-Reduce Framework
    Authors: Sneha B. Mandlik and Prof. V. R.Sonawane
    Abstract: During the searching process the user enters candidate keywords, a search algorithm executes the corresponding query on the target dataset, and a result set is returned. When the group of keywords is confusing or ambiguous, or the keywords are short and similar, irrelevant results are produced, because such algorithms fetch exact matches even when the input query or keywords are problematic. This is the problem the system addresses. By considering each keyword and its relevant context in the XML data, searching is performed through an automatic diversification process for XML keyword search. In this way the system can satisfy the user, who receives an analytical result set based on the context of the search keywords. For more efficiency, and to deal with big data, the Hadoop platform is used.
    Click Here to Download Pdf
    Title: Design a System to identify the Emotion Using MFCC Feature Extraction and Classification using BPNN
    Authors: Poonam Kamboj and Er. Amarinder Kaur
    Abstract: In human-machine interface applications, emotion recognition from the speech signal has been a research topic for many years, and many systems have been developed to recognize emotions from speech. Humans have the natural ability to use all their available senses for maximum awareness of a received message, and through these senses people genuinely sense the emotional state of their communication partner. Emotion detection is natural for people but a very difficult task for a machine; the purpose of an emotion-recognition system is therefore to use emotion-related information in such a way that human-machine communication is improved. Speech emotion recognition is essentially a pattern-recognition problem, so the stages present in a pattern-recognition system are also present in a speech-emotion-recognition system, which comprises five main components: emotional speech input, feature extraction, feature selection, classification, and recognized emotional output. This paper reviews speech emotion recognition based on earlier technologies that use different classifiers for emotion recognition; the classifiers discriminate emotions such as anger, happiness, and sadness. The database for the speech-emotion-recognition system consists of emotional speech samples, and the features extracted from these samples are energy, pitch, and Mel-frequency cepstral coefficients (MFCC); performance is optimized on the extracted MFCC features. The classifier, a back-propagation neural network, relies on feature reduction with ant colony optimization (ACO). Inferences about the performance and limitations of speech-emotion-recognition systems based on the different classifiers are also discussed.
    Click Here to Download Pdf
    Title: NoGoDi - A Normalized Google Distance (NGD) Based Approach for Text Document Clustering
    Authors: A.George Louis Raja, F.Sagayaraj Francis and P.Sugumar
    Abstract: In this paper we propose NoGoDi, a text-document clustering method based on the conjecture that two text documents are relatively similar (and can be placed in the same cluster) if they tend to use semantically related words in their contexts: the more they draw on a common vocabulary, the greater the chance that they are close to each other in context. This paper is motivated by this tendency of documents, which can be exploited and tested to prune them into clusters. We estimate the semantic similarity of the documents through the Normalized Google Distance (NGD), which measures the semantic similarity between two words. The documents undergo the usual clustering stages, such as tokenization, stop-word removal, and trimming, after which a semantic similarity matrix is generated with the NGD measure. This matrix is then analyzed to estimate the similarity of the documents, from which they can be clustered iteratively.
    Click Here to Download Pdf
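    The NGD measure that drives the similarity matrix has a standard closed form over search-hit counts; a minimal sketch (the hit counts below are invented for illustration):

```python
import math

def ngd(fx, fy, fxy, n):
    """Normalized Google Distance: fx and fy are the page counts for each
    term alone, fxy the count for pages containing both, n the total
    number of indexed pages. 0 means the terms always co-occur."""
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# A term is at distance 0 from itself; rarer co-occurrence means greater distance.
print(ngd(1000, 1000, 1000, 10**8))   # 0.0
print(ngd(10**5, 10**5, 10**4, 10**8) > ngd(10**5, 10**5, 10**5, 10**8))  # True
```

    Filling an n-by-n matrix with pairwise `ngd` values over the documents' vocabularies yields the semantic similarity matrix the method clusters on.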
    Title: MOOCs and the Need for POOCs
    Authors: J.Saul Nicholas and F.Sagayaraj Francis
    Abstract: Massive Open Online Courses (MOOCs) have been at the center of attention in recent years. This paper presents the features of MOOCs and their limitations. The main problem of all online learning environments, however, is their lack of personalization according to learners' knowledge, learning styles, and other learning preferences. This paper therefore also presents the need for personalized MOOCs (POOCs), as well as the parameters that can be used to measure the effectiveness of MOOCs and their characteristics.
    Click Here to Download Pdf
    Title: Brain Tumor Segmentation Techniques: A Survey
    Authors: Navpreet kaur and Mr. Manvinder Sharma
    Abstract: The process of splitting an image into multiple parts is called segmentation. Segmentation can be regarded as the vital, decisive procedure for facilitating the visualization, presentation, and characterization of regions of interest in any medical image. Although research in the medical field has advanced considerably, segmentation remains a challenging problem because of image noise, differing image structures, varied image content, and many other factors. Efficient and fast algorithms are needed, and the accuracy, processing time, and overall performance of image segmentation must be improved. In this paper, various brain-tumor extraction techniques are reviewed.
    Click Here to Download Pdf
    Title: File Level Hybrid De-duplication Hashing Technique for Optimize Cloud Storage
    Authors: Manpreet Kaur and Daljit Kaur
    Abstract: With the constant growth of data, reducing the storage space that data occupies and the bandwidth consumed during data transfer has become a key concern. Experiments show that a large amount of redundant data exists in storage systems, and data de-duplication technology addresses this problem. In this paper we illustrate data de-duplication using a hybrid algorithm (MD5 + SHA3) that provides more accurate results; we implement it with a hashing technique for the purpose of de-duplication. With this algorithm one can save more storage space while achieving higher accuracy and lower time consumption.
    Click Here to Download Pdf
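    At file level, the hybrid MD5 + SHA3 idea amounts to indexing stored files by a combined digest; this is an illustrative sketch with our own names (`hybrid_key`, `store_files`), not the paper's implementation.

```python
import hashlib

def hybrid_key(data: bytes) -> str:
    """MD5 and SHA3-256 digests concatenated: a false duplicate would
    require a simultaneous collision in both hash functions."""
    return hashlib.md5(data).hexdigest() + hashlib.sha3_256(data).hexdigest()

def store_files(files):
    """Keep one stored copy per unique file content; return how many
    physical copies are actually kept."""
    index = {}
    for name, data in files:
        index.setdefault(hybrid_key(data), (name, data))
    return len(index)

files = [("a.txt", b"report"), ("b.txt", b"report"), ("c.txt", b"notes")]
print(store_files(files))   # 2: b.txt shares a.txt's content
```

    Keying on content rather than file name is what lets identically-named different files coexist while identical contents collapse to one copy.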
    Title: Load Distribution in Email Traffic
    Authors: Sandipan Basu
    Abstract: Network congestion has been a significant topic in the study of computer networking. Network servers must process innumerable requests per hour and per day, and the concern is how to handle this huge amount of traffic while maintaining quality of service (QoS). This paper proposes an approach to deal with it.
    Click Here to Download Pdf
    Title: Review on Enhanced Software Defect prediction Methods
    Authors: Shelly Bhalla and Jasvir Kaur
    Abstract: Several indicators should be considered when estimating software cost and effort; one of the most important is the size of the project, since effort and cost estimation depends on accurately predicting size. Effort and cost estimation is generally difficult in software projects. In this paper we review different data-mining techniques for software defect prediction based on different features.
    Click Here to Download Pdf
    Title: Safe Data Discovery and Dissemination in Mobile Wireless Sensor Network
    Authors: Jasmine Kaur and Mukesh Sharma
    Abstract: Currently there is no standard for MWSNs, so protocols are often borrowed from MANETs. MANET protocols are preferred because they can work in mobile environments, whereas WSN protocols often are not suitable: they provide the required functionality but cannot handle the high frequency of topology changes, or data discovery and dissemination, in a mobile wireless sensor network, and so are not effective. MANET routing protocols such as AODV, DSR, AOMDV, and DSDV can therefore be used. After evaluating them in the NS-2 network simulator on packet delivery ratio, packet drop ratio, routing overhead, network throughput, total received packets, and command packets, DSR is found to be best suited for secure data discovery and dissemination.
    Click Here to Download Pdf
    Title: A proposal of Improved K-means algorithm for detecting gratuitous Email Spamming
    Authors: Garima Jain and Parveen Kakkar
    Abstract: Email is the main source of communication nowadays: people use it for personal and professional purposes because it is approachable, easy, effective, faster, and cheaper, and it lets us transfer information globally over the Internet. Email usage grows day by day, and with it email spamming. Investigations report that a user receives more spam than ham mail. Spam is gratuitous junk messaging used to spread viruses and malicious code for profit at zero cost, and it is a major problem for electronic mail, so distinguishing ham from spam is very important; many classification and clustering methods have been proposed for this. This paper evaluates performance based on correctly and incorrectly classified instances using K-means and an improved K-means. K-means is a numerical, unsupervised, non-deterministic, iterative method; it is simple and very fast, and in many practical applications it proves effective at producing good clustering results. The paper presents a comparative analysis of the K-means and improved K-means clustering algorithms on an email dataset. The experiments show that the improved K-means reduces clustering time and increases the accuracy of correctly identified instances; the results, obtained with the J48 classifier, show that the improved K-means performs better than K-means.
    Click Here to Download Pdf
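    The baseline K-means the paper improves on is Lloyd's algorithm; a minimal 1-D sketch, with invented per-message spam-word counts standing in for the email features (the paper's actual feature set is not shown here):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain 1-D K-means (Lloyd's algorithm): assign each point to its
    nearest centroid, then move each centroid to its cluster's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Counts of spam-trigger words per message: two natural groups (ham vs spam).
scores = [0, 1, 1, 2, 8, 9, 10, 9]
print(kmeans(scores, 2))   # [1.0, 9.0]
```

    The non-determinism the abstract mentions comes from the random initial centroids; improved variants typically target exactly that initialization step.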
