Title: Analysis of Network Performance in Heterogeneous Network over Different AQMS
Authors: Shallu Bedi and Mr. Gagangeet S. Aujla
Abstract: A MANET is a self-contained network that may also be connected to a larger network such as the Internet. All nodes are independent of one another and can communicate freely with each other, combining aspects of a P2P network and a multi-hop network. The heterogeneous nature of such networks can create problems for traffic management on the backbone links. Multimedia and graphics traffic mostly flows over UDP, while traffic such as HTTPS and link management flows over TCP. To analyze traffic management, we must understand the behavior of various Active Queue Management (AQM) techniques on different traffic classes over TCP and UDP. In this work, a multi-class network is analyzed for both TCP and UDP traffic with respect to network performance parameters such as throughput, packet loss ratio, and average end-to-end delay under various AQM techniques. Four AQM techniques are tested for both TCP and UDP traffic classes. Most fast multimedia and graphics traffic is based on UDP, so analysis of this traffic class is supremely important. This paper presents a concise analysis of detailed findings for Drop Tail, SFQ, RED, and REM under varying network conditions. The throughput of the network is calculated by varying network conditions such as bandwidth, delay, and channel error rate. For TCP, SFQ was observed to perform best, as it employs fair queuing algorithms to handle packet flows on links with simultaneous sessions. For UDP, on the other hand, Drop Tail, RED, and REM performed best in different scenarios, as their queue management depends only on link state and congestion, not on the traffic flow.
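The abstract compares Drop Tail, SFQ, RED, and REM. For reference, the classic RED decision can be sketched as follows; the thresholds, maximum drop probability, and EWMA weight below are illustrative textbook defaults, not values taken from the paper.

```python
# Hedged sketch of the RED (Random Early Detection) AQM drop rule.
# min_th, max_th, max_p, and weight are illustrative assumptions.

def red_drop_probability(avg_queue, min_th=5.0, max_th=15.0, max_p=0.1):
    """Packet-drop probability for a given average queue length."""
    if avg_queue < min_th:
        return 0.0                 # queue short: never drop early
    if avg_queue >= max_th:
        return 1.0                 # queue long: drop every arrival
    # linear ramp between the two thresholds
    return max_p * (avg_queue - min_th) / (max_th - min_th)

def update_avg(avg_queue, instant_queue, weight=0.002):
    """Exponentially weighted moving average of the instantaneous queue length."""
    return (1 - weight) * avg_queue + weight * instant_queue
```

Drop Tail, by contrast, drops only when the instantaneous queue is full, which is what makes RED's early, probabilistic dropping gentler on TCP flows.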
Authors: Kanika Minhas, Er. Amanpreet Kaur and Dr. Dheerendera Singh
Abstract: Denial of Service attacks present an increasing threat to the global inter-networking infrastructure. Although the TCP congestion control algorithm is highly efficient across many networking scenarios and operations, its built-in assumption of end-system cooperation leaves it prone to attack by high-rate flows. A Shrew attack uses carefully designed low-rate bursts that exploit TCP's retransmission timeout mechanism, throttling the bandwidth of a TCP flow smoothly without the attacker ever appearing as an intruder. Shrew attacks are further classified into low-rate and high-rate variants; a high-rate Shrew attack periodically sends high-rate packet streams at low frequency. Such an attack can degrade the performance of a network to a large extent.
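The damage done by such periodic bursts can be captured by a simple model: if every burst, sent with period T, forces the victim flow into a retransmission timeout of roughly min_rto seconds, the flow transmits only during the remaining T - min_rto of each period. A minimal sketch, under that assumption (the 1-second min_rto is the common TCP default, not a figure from the paper):

```python
# Hedged sketch of the throughput model for a periodic low-rate (Shrew) attack.
# Assumes each burst triggers a full retransmission timeout of min_rto seconds.

def normalized_tcp_throughput(period, min_rto=1.0):
    """Fraction of nominal throughput a TCP flow keeps under periodic bursts."""
    if period <= min_rto:
        return 0.0          # the flow times out again before it can recover
    return (period - min_rto) / period
```

This shows why the attack is hard to spot: with T = 2 s the attacker is idle most of the time yet halves the victim's throughput.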
Title: Recognition of the image by using plant diseases ontology in image processing
Authors: Er. Nikita Rishi and Er. Jagbir Singh Gill
Abstract: Plant disease is a condition caused by infectious organisms or environmental factors. Plants play a very decisive role in the ecological balance and are a prerequisite to life. Various environmental factors cause plant diseases, in addition to the diseases caused by infectious micro-organisms. Hence there is an urgent need to identify these diseases for the betterment of agriculture. In this paper, image segmentation, feature extraction, PCA (Principal Component Analysis) and adaptive K-means clustering are examined for the identification of heterogeneous plant diseases.
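The clustering step the abstract examines can be illustrated with plain k-means on pixel intensities, the building block that adaptive variants refine. A minimal sketch on a toy 1-D "image"; the data, k = 2, and the simple initialization are illustrative assumptions, not the paper's method.

```python
# Hedged sketch of k-means clustering on scalar pixel values, as used to
# separate diseased from healthy leaf regions.  Toy data and k=2 are assumed.

def kmeans_1d(pixels, k=2, iters=20):
    """Cluster scalar pixel values into k groups; return sorted centroids."""
    # naive initialization: evenly spaced samples of the sorted pixels
    centroids = sorted(pixels)[:: max(1, len(pixels) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to its nearest centroid
            i = min(range(k), key=lambda c: abs(p - centroids[c]))
            groups[i].append(p)
        # move each centroid to the mean of its assigned pixels
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return sorted(centroids)
```

On a real leaf image the same loop runs on RGB or Lab color vectors rather than scalars, and "adaptive" variants additionally choose k from the data.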
Title: A Survey on Anomaly Detection Behaviour with Kernel Mapping and Inclusion
Authors: Dr. R. Umagandhi and K. Gomathi
Abstract: Anomaly detection is a significant problem that has been researched within various research areas and application domains. Many anomaly detection methods have been developed for specific application domains, while others are more general. This survey paper describes an anomaly detection technique for unsupervised data sets that accurately removes data from a kernel eigenspace without performing a batch re-computation. For each anomalous behavior, the key factors that the methods use to differentiate between normal and abnormal actions are identified. The survey provides a brief understanding of the techniques belonging to each anomaly detection and kernel mapping category, and, for each grouping, identifies the improvements and drawbacks of the techniques in that category. It also discusses the computational complexity of the techniques, since this is an important issue in real application domains. We hope that this survey will provide a good understanding of the many directions in which research has been done on this topic.
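The kernel-mapping idea behind such methods can be sketched with the simplest kernel anomaly score: a point whose mean RBF-kernel similarity to the normal data is low lies far from the data in feature space. This is a generic illustration, not the survey's eigenspace-update technique; the gamma value and toy data are assumptions.

```python
import math

# Hedged sketch of a kernel-based anomaly score: low mean similarity to the
# training set in RBF feature space marks a point as anomalous.

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def anomaly_score(x, train, gamma=0.5):
    """1 minus the mean kernel similarity to the training points.
    Higher scores indicate more anomalous behaviour."""
    return 1.0 - sum(rbf(x, t, gamma) for t in train) / len(train)
```

Kernel eigenspace methods refine this by projecting onto the dominant eigenvectors of the kernel matrix and scoring by reconstruction error, which is what makes incremental updates without batch re-computation valuable.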
Title: Improved Routing Protocol Using Hybrid Ant Colony Optimization and Particle Swarm Optimization
Authors: Neelam Kumari and Arpinder Singh Sandhu
Abstract: The principal problem of QoS routing is to set up a multicast hierarchy that can meet specific QoS constraints. To reduce the limitations of earlier work, a new improved approach is proposed in this effort. In the proposed technique, the problem of building the multicast tree is handled using a clustering-based method: first, multi-radio, multi-channel clustering is deployed, and the cluster heads are then responsible for the multicasting. This reduces the overall energy consumption of the nodes as well as the complexity of the intelligent algorithms. The approach is evaluated on the basis of ant colony optimization and produces better results in comparison with other strategies.
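The PSO half of the hybrid named in the title rests on one canonical update rule, which a routing optimizer could use to tune route weights. A minimal sketch for a scalar particle; the inertia and acceleration coefficients are the usual textbook defaults, assumed rather than taken from the paper.

```python
import random

# Hedged sketch of the canonical particle swarm optimization (PSO) update.
# w, c1, c2 are standard illustrative defaults, not the paper's parameters.

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random.random):
    """One PSO velocity/position update for a scalar particle."""
    vel = (w * vel
           + c1 * rng() * (pbest - pos)   # pull toward the particle's own best
           + c2 * rng() * (gbest - pos))  # pull toward the swarm's best
    return pos + vel, vel
```

In a hybrid scheme, ACO's pheromone trails typically pick candidate routes while PSO tunes continuous parameters such as link weights, each iteration feeding the other's best solution.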
Title: A Comprehensive Study on MCC (Mobile Cloud Computing) Architecture, Challenges and Benefits
Authors: Dr. Mukesh Chandra Negi
Abstract: Cloud computing is a technique where, based on a cloud delivery model, your complete or partial IT infrastructure and applications are managed by a third-party cloud service provider; you simply subscribe to the services you want to use when the infrastructure is completely managed and owned by the provider. It is especially helpful where you require highly efficient resources such as memory, processor, CPU and storage under cost constraints. Given the growth of smart mobile devices in the last few years, the need has been recognized to identify and develop a rich application platform that brings the benefits of cloud computing to mobile devices. The Mobile Cloud Computing (MCC) concept was developed in response, and it is a powerful platform for leveraging the benefits of cloud computing with mobile devices. In MCC, thin client applications are developed for the mobile devices while all backend processing workload is offloaded to the cloud environment, which executes the user's requests and sends the results back to the thin application on the mobile device. In this paper I explain the basic and high-level architecture, challenges and benefits of the mobile cloud computing concept and technology.
Title: A Study of Code Smells in Software Versioning System Using Datamining Techniques
Authors: Dr. R. Beena and Dithy MD
Abstract: Code smells are used to identify the poor design and implementation choices that lead to deeper problems in code repositories. Code smell symptoms may hinder code comprehension; many approaches have been proposed to detect them, but they may still be inadequate for detecting many of the smells under principle constraints and quality measures. The present study discusses detection and prevention system models that detect code smells based on structural and historical information, for smell types such as divergent change, shotgun surgery, parallel inheritance, blob and feature envy. Prevention is considered mandatory for improving code behaviour as agile development becomes popular among developers; hence prevention of code smells is developed using refactoring techniques. Experimental results are evaluated against competitive approaches using data mining metrics such as precision and recall.
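The precision and recall metrics the abstract evaluates with can be stated concretely for smell detection: precision is the fraction of reported smells that are real, recall the fraction of real smells that were reported. A minimal sketch with hypothetical smell labels:

```python
# Hedged sketch of precision/recall for code smell detection.
# The smell names in the test data are illustrative, not a real benchmark.

def precision_recall(detected, actual):
    """Precision and recall of a detected smell set against ground truth."""
    tp = len(detected & actual)                       # true positives
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall
```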
Abstract: Today, human beings can't imagine surviving without the internet, as it has become an inevitable part of their lives in which each and every thing in this world is connected together. Connectivity is one of the fundamental challenges in achieving this, and continuous efforts have been made toward it, nowadays even in the direction of the sky. As the number of users accessing the internet keeps increasing day by day, Facebook has taken an initiative in the form of drones. This paper focuses on a mechanism that is a level ahead of previous ones for providing internet services: the initiative of the project handled by Facebook and Internet.org in the form of "FACEBOOK DRONES", which aims to provide internet service to the areas of the world where people have little or no internet access. It is a technique of providing internet services through huge drones: with the wingspan of a Boeing 737 yet weighing less than a car, a drone will operate at a height of 60,000 to 90,000 feet in the air and can stay airborne for three months while offering internet speeds of 10 gigabits per second.
Title: TSM: TWU and Suffix-based High Utility Itemset Mining Algorithm
Authors: Sonam Panwar and Ajay Jangra
Abstract: Utility mining treats individual items differently according to their respective utilities and mines items with high utilities. This mining task is commonly known as High Utility Itemset (HUI) mining. To perform it, most available algorithms first generate a large number of candidates, many of which are eventually found to be low-utility itemsets. Moreover, a large amount of time is consumed in the join operations needed to generate larger itemsets from smaller ones. In this paper, we present a fast hash-map based algorithm named TSM that efficiently prunes low-utility itemsets using co-occurrence information. The overall impact is a considerable reduction in join operations and hence faster execution.
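The TWU part of the title refers to the standard Transaction-Weighted Utility pruning step: an item whose total TWU is below the minimum utility threshold cannot appear in any high-utility itemset, so it can be discarded before any joins. A minimal sketch of that step (the toy database and threshold are illustrative assumptions, not the paper's TSM algorithm):

```python
from collections import defaultdict

# Hedged sketch of TWU (Transaction-Weighted Utility) pruning.
# An item's TWU is the sum of the utilities of the transactions containing it;
# items with TWU below min_util cannot be part of any high-utility itemset.

def twu_prune(transactions, min_util):
    """transactions: list of (item_set, transaction_utility) pairs.
    Returns the set of items surviving the TWU threshold."""
    twu = defaultdict(int)
    for items, tu in transactions:
        for item in items:
            twu[item] += tu        # each item inherits its transaction's utility
    return {item for item, total in twu.items() if total >= min_util}
```

Pruning early like this is what shrinks the candidate space before the expensive itemset-join phase.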
Title: A Review of Novel Hybrid Image Forgery Detection Model
Authors: Samiksha Singla and Harpreet Tiwana
Abstract: Image forgery is the term given to copying image content after editing the digital image data so as to remove its similarity with the original image. Image forgery causes monetary losses to graphic design professionals and digital media companies, whose business in graphic design and related services it severely hurts. Image forgery detection is the branch of digital image processing that detects forgery in image content for the purpose of copyright protection. In this paper, we propose an image forgery detection model that uses a hybrid algorithm combining SURF, FREAK, SVM and greedy algorithms. The proposed model is designed as a double-layered model for forgery detection and relies upon the greedy algorithm for final result generation. The proposed model is expected to outperform existing image forgery detection models; its results would be reported in terms of accuracy, precision and recall, and statistical analysis would be performed over the obtained results in order to analyze the performance of the proposed model.