Volume 4 Issue 4 - April 2014

 

S.No Title Page
1. An Impact of Implementing Various Cryptographic Techniques Efficiently in a Public Centric Cloud
Thamil kumaran V.C, Chithra Mol C.R, & Sai Prasath
Abstract

The main issue we consider in this paper is securing private data in public-centric cloud storage. It is very important to protect one's own data in a public storage environment such as the cloud. We address various cryptographic techniques that provide efficient, higher-order data security in the cloud, and we survey the architectures that support them. Some core traditional mechanisms for addressing privacy are no longer flexible enough, so new approaches need to be developed to address these security issues. In this paper we assess how security, trust, and privacy issues arise in the context of cloud computing and discuss ways in which they may be addressed.

83-86
Full Text PDF
2. Hamming Code Detecting and Correcting Errors in Credit Cards Using Neural Networks
Dr R.Dhanapal, Gayathiri.P
Abstract

In this paper we discuss how Hamming codes can detect and correct errors in credit-card information such as the card number, expiry date, and CVV number. The card stores the holder's information in a chip, and the chip transmits that information in binary form. Error handling is of two types: the first detects errors during transmission, and the second corrects them. One common error-detecting code used in credit cards is the parity-check code. Hamming developed a technique for detecting and correcting single-bit errors in transmitted data. The technique requires that three parity bits (or check bits) be transmitted with every four data bits; the algorithm is called a (7,4) code because it requires seven bits to encode four bits of data. A parity bit is used to detect errors in a codeword: it checks whether an error has occurred in the card-holder information and determines how to correct the erroneous bit. This can be implemented with a Hamming code generator matrix using neural networks.
Keywords: Hamming Code, Neural Networks, Credit Card, Generator Matrix, Parity Bit.

87-91
Full Text PDF
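The (7,4) scheme described in the abstract above can be sketched in a few lines. This is a minimal illustration of the standard Hamming(7,4) encode/correct procedure with parity bits at positions 1, 2, and 4, not the paper's generator-matrix or neural-network implementation (function names are hypothetical):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parity checks; the syndrome is the 1-based position
    of a single-bit error (0 means no error). Returns (corrected, position)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4
    c = list(c)
    if pos:
        c[pos - 1] ^= 1        # flip the erroneous bit back
    return c, pos
```

For example, flipping any single bit of `hamming74_encode([1, 0, 1, 1])` yields a codeword that `hamming74_correct` restores, with the syndrome naming the flipped position.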
3. Best Mechanism for Increasing the Data Method for Controlling and Maintaining Efficient Technique in MANET
S.Karthick, D.Adhimugasivasakthi, M.Ganesan, A.Laxmareddy, P.Vetrivel
Abstract

A high-performance, low-power architecture is devised for an 8 Mbps infrared wireless communication system dedicated to mobile ad hoc networks. In this architecture, 4-PPM (4-pulse position modulation) infrared signals detected by an infrared receiver are digitized, and the pulses are demodulated by a 2-bit digital demodulator. To improve the dynamic range of the link length, the PPM demodulator is synthesized to implement a demodulation algorithm constructed to accommodate the output tolerance of the infrared receiver. Experimental results show that the realized 8 Mbps infrared communication system can achieve an error-free link in the range of 0-280 cm at 180 mW power consumption. DCIM is a pull-based algorithm that implements adaptive time-to-live (TTL), piggybacking, and prefetching, and provides near-strong consistency. Cached data items are assigned adaptive TTL values corresponding to their update rates at the data source; items with expired TTL values are grouped into validation requests to the data source to refresh them, whereas unexpired items with high request rates are prefetched from the server. In this paper, DCIM is analyzed to assess its delay and bandwidth gains or costs when compared to poll-every-time and push-based schemes.

92-96
Full Text PDF
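The adaptive-TTL idea behind DCIM, where each cached item expires according to its own time-to-live and must then be revalidated at the source, can be illustrated with a minimal cache sketch. The class name and interface are hypothetical, and DCIM's grouping of validation requests and its prefetching are not reproduced:

```python
import time

class TTLCache:
    """Minimal pull-based cache: each item carries its own TTL and is
    dropped once it expires, forcing revalidation at the data source."""
    def __init__(self):
        self._store = {}                       # key -> (value, expiry time)

    def put(self, key, value, ttl):
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key):
        item = self._store.get(key)
        if item is None:
            return None
        value, expiry = item
        if time.monotonic() >= expiry:
            del self._store[key]               # expired: must revalidate
            return None
        return value
```

In DCIM the TTL would be adapted per item to its update rate at the source; here it is simply supplied by the caller.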
4. Theoretical Study of Ambient Noise Cancellation for Mobile Phones
S.Bharathiraja and G.Sumalatha
Abstract

All of us have experienced trying to make a mobile phone call from a noisy street, crowded restaurant, or train station, where the background noise can make it impossible to hear the incoming call. It can be worse when the person next to you in these situations is yelling into the receiver in an attempt to be heard. Active and passive noise-cancelling technologies can minimize background noise in high-end headphones; however, these technologies today cannot provide the same benefits in mobile handsets. Clearly, mobile handsets could benefit from noise cancellation. Active noise cancellation (ANC) has been used in headphones for several years, often to cut out the drone of an aircraft on a long journey. But the full benefit of the technology has not been realized: the potential is for noise cancellation to be used in a much wider range of applications, creating a field of quiet for users of a range of consumer equipment. A highly effective noise-cancelling scheme would not only benefit consumers by helping to save our hearing; carriers would also benefit from longer calls, more calls, and more satisfied consumers, resulting in higher revenues.

97-99
Full Text PDF
5. Performance Analysis of TCP Congestion Control Algorithms
Priyanka K. Shinde, Prof. Nitin R. Chopde
Abstract

The demand for fast transfer of larger volumes of data, and for deployment of the network infrastructures to carry them, is ever increasing. However, TCP, the dominant transport protocol of today, does not meet this demand because it favors reliability over timeliness and fails to fully utilize the network capacity due to the limitations of its conservative congestion control algorithm. The slow response of TCP in fast long-distance networks leaves sizeable unused bandwidth in such networks. A large variety of TCP variants have been proposed to improve a connection's throughput by adopting more aggressive congestion control algorithms. Some flavors of TCP congestion control are loss-based, high-speed algorithms that use packet losses as an indication of congestion; delay-based algorithms instead emphasize packet delay rather than packet loss as the signal that determines the sending rate. Some efforts combine the features of loss-based and delay-based algorithms to achieve fair bandwidth allocation and fairness among flows. A comparative analysis between different flavors of TCP congestion control, namely standard TCP congestion control (TCP Reno), loss-based congestion control (HighSpeed TCP, Scalable TCP, CUBIC TCP), delay-based congestion control (TCP Vegas), and mixed loss-delay based congestion control (Compound TCP), is presented in this paper.

Keywords: Congestion control, High-speed networks, TCP.
100-102
Full Text PDF
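The conservative behaviour of standard TCP (Reno) that the abstract contrasts with the newer variants can be sketched as a slow-start/AIMD window update. This is a simplified per-event model with hypothetical names, not a full protocol implementation:

```python
def aimd_trace(events, cwnd=1.0, ssthresh=16.0):
    """Sketch of Reno-style congestion control: below ssthresh the window
    grows by 1 per ACK (slow start); above it, by ~1 per RTT (congestion
    avoidance); a loss halves the window (multiplicative decrease)."""
    trace = []
    for ev in events:
        if ev == "ack":
            if cwnd < ssthresh:
                cwnd += 1.0            # slow start
            else:
                cwnd += 1.0 / cwnd     # additive increase
        elif ev == "loss":
            ssthresh = cwnd / 2.0
            cwnd = ssthresh            # multiplicative decrease
        trace.append(cwnd)
    return trace
```

The halving on loss is exactly the conservatism the surveyed variants relax: HighSpeed and CUBIC grow the window faster and cut it less aggressively on high-bandwidth paths.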
6. Gathering Web Information for Personalized Ontology Model
Nandini P.Wasnik, Prof.Ms.Mahip M.Bartere
Abstract

In personalized web information gathering, an ontology is used for knowledge description: mainly to acquire, share, and reuse knowledge and to enrich the description of relations among concepts. The paper presents different problems and searching techniques, and the related work surveys how different authors have applied ontologies. The main task of the ontology here is to gather web information based on keywords, which may come from a local or a global repository. Information gathering is initialized according to the user profile. A section also covers the basic architecture of the ontology, which drives the overall information gathering; a learning step extracts structured information from unstructured input. Ontologies as a model for knowledge description and validation are widely used to represent user profiles in personalized web information gathering, but when representing user profiles, most models have access to knowledge from either a global knowledge base or the user's local information only. This paper proposes a personalized ontology model for knowledge representation and reasoning over user profiles that draws on both a world knowledge base and a user's local instance repository. The ontology model is evaluated against benchmark models in web information gathering. The concept models of user profiles are possessed by users and generated from their background knowledge; such a concept model cannot be proven in laboratories, but many web ontologists have observed it in user behaviour, and the results show that this ontology model is successful.
Keywords: Ontology, personalization, semantic relation, world knowledge, local instance repository, user profile, web information gathering.

103-107
Full Text PDF
7. Face Detection Based Color Image CAPTCHA Generation by using FaceDCAPTCHA Algorithm: A Novel Approach
Vaishnavi J.Deshmukh, Sangram S.Dandge
Abstract

 

108-113
Full Text PDF
8. Data Center Transmission Control Protocol: An Efficient Packet Transport for the Commoditized Data Center
Madhavi Gulhane, Dr.Sunil R.Gupta
Abstract

Cloud data centers host diverse applications, mixing in the same network a plethora of workflows that require small predictable latency with others requiring large sustained throughput. In this environment, today's state-of-the-art TCP protocol falls short. We present measurements of a 6000-server production cluster and reveal network impairments, such as queue buildup, buffer pressure, and incast, that lead to high application latencies. Using these insights, we propose a variant of TCP, DCTCP, for data center networks. DCTCP leverages Explicit Congestion Notification (ECN) and a simple multibit feedback mechanism at the host. We evaluate DCTCP at 1 and 10 Gbps speeds through benchmark experiments and analysis. In the data center, operating with commodity, shallow-buffered switches, DCTCP delivers the same or better throughput than TCP while using 90% less buffer space. Unlike TCP, it also provides high burst tolerance and low latency for short flows. While TCP's limitations cause our developers to restrict the traffic they send today, using DCTCP enables the applications to handle 10X the current background traffic without impacting foreground traffic. Further, a 10X increase in foreground traffic does not cause any timeouts, thus largely eliminating incast problems.

114-120
Full Text PDF
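DCTCP's core rule keeps a running estimate, alpha, of the fraction of ECN-marked ACKs and cuts the window in proportion to it rather than halving it outright. A minimal sketch of that per-window update follows (the function name and bookkeeping are assumptions; the equations match the published DCTCP scheme):

```python
def dctcp_update(cwnd, alpha, acked, marked, g=1.0 / 16):
    """One window's worth of DCTCP feedback.
    F is the fraction of ACKs with ECN marks over the last window;
    alpha is an EWMA of F; cwnd is reduced in proportion to alpha."""
    F = marked / acked if acked else 0.0
    alpha = (1 - g) * alpha + g * F        # alpha <- (1-g)*alpha + g*F
    if marked:
        cwnd = cwnd * (1 - alpha / 2)      # cwnd <- cwnd*(1 - alpha/2)
    return cwnd, alpha
```

Under light marking, alpha stays small and the window barely shrinks, which is how DCTCP keeps queues short without sacrificing throughput; under persistent congestion, alpha approaches 1 and the cut approaches TCP's halving.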
9. Generic Algorithm for Image Tampering Detection Based on Claimant Suspect Decision Rule
Deepali N. Pande, A.R. Bhagat Patil, Antara S. Bhattacharya
Abstract

This paper presents an innovative algorithm for image tampering detection based on a forgery suspect generated by the claimant. The scheme has good scope for third-party authentication of raw images. The image at the sender side is shielded with security parameters generated from a cumulative visual word built from unique color features of the image. The recipient checks for a match with the commonly shared secret parameters. A mismatch generates suspicion parameters, which serve as input for generating a bag of features. Euclidean distance is used as the metric for localizing tampered regions. The scheme successfully localizes copy-paste attacks, image splicing, and transformation-based attacks. The experimental results show accurate localization of tampering.

Keywords: image security, active tampering detection, passive tampering detection, image forgery detection.

121-124
Full Text PDF
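The abstract's use of Euclidean distance to localize tampered regions can be illustrated with a block-wise comparison sketch. The feature vectors and threshold here are hypothetical; the paper's bag-of-features construction is not reproduced:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def localize_tampered(ref_features, test_features, threshold):
    """Return the indices of blocks whose feature vectors drift beyond
    the threshold from the reference image's corresponding blocks."""
    return [i for i, (r, t) in enumerate(zip(ref_features, test_features))
            if euclidean(r, t) > threshold]
```

In practice the per-block features would be the visual words derived from the image's color features, and the threshold would be tuned to separate legitimate compression noise from actual tampering.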
10. Private Content Based Multimedia Information Retrieval Using Map-Reduce
Swapnil P. Dravyakar, Sunil B. Mane, Dr. Pradeep K. Sinha
Abstract

Today, large amounts of varied multimedia information such as audio, video, and images are massively produced through microphones, digital cameras, mobile phones, photo-editing software, etc. In this paper we propose a solution for a large database of images that provides secure, efficient, and effective search and retrieval of images similar to a query image, using the novel technique of Local Tetra Patterns (LTrPs) for Content-Based Image Retrieval (CBIR), which captures the interrelationship between a center pixel and its surrounding neighbours by computing gray-level differences. The images stored in the database are private or contain private information of a particular user, and these digital images should not be accessible to anyone except that user. Users want privacy and security for their images while storing and retrieving them from the database; to that end, private content-based image retrieval (PCBIR) is used to find images similar to the query image without disclosing them even to the database administrator.
This system incorporates the Map-Reduce technique for private search and retrieval of images over large datasets of user images. Another part of the system is large-scale storage, i.e., the Hadoop Distributed File System (HDFS); Hadoop provides a scheme that processes large datasets on a distributed cluster of computers using a simple programming model. The proposed solution is a system to upload, search, and retrieve a query image among the database of images.

Keywords - Map-Reduce, CBIR, Image Retrievals, Local Tetra Patterns (LTrPs), Hadoop, Cloud Computing.

125-128
Full Text PDF
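The center-versus-neighbour differencing that underlies Local Tetra Patterns can be illustrated with a simplified local binary pattern over a 3x3 patch. This is a sketch of the general idea only; the full tetra-pattern coding also encodes derivative directions and is more involved:

```python
def local_pattern(patch):
    """Simplified local binary pattern for a 3x3 gray-level patch:
    each neighbour contributes one bit (1 if >= center), and the bits
    are packed into a single descriptor value in [0, 255]."""
    center = patch[1][1]
    # clockwise neighbours starting at the top-left corner
    coords = [(0, 0), (0, 1), (0, 2), (1, 2),
              (2, 2), (2, 1), (2, 0), (1, 0)]
    bits = [1 if patch[r][c] >= center else 0 for r, c in coords]
    return sum(b << i for i, b in enumerate(bits))
```

A CBIR pipeline would compute such a descriptor at every pixel, histogram the values over the image, and compare histograms to rank candidate matches for a query image.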
11. A Survey Based on Fingerprint, Face and Iris Biometric Recognition System, Image Quality Assessment and Fake Biometric
Pradnya M. Shende, Dr. Milind V. Sarode, Mangesh M. Ghonge
Abstract

A biometric system is a computer system used to identify a person by behavioral and physiological characteristics (for example fingerprint, face, iris, keystroke, signature, voice, etc.). A typical biometric system consists of sensing, feature extraction, and matching modules, but nowadays biometric systems are attacked using fake biometrics. This paper introduces three biometric techniques, face recognition, fingerprint recognition, and iris recognition (a multi-biometric system), describes the attacks on such systems, and shows how image quality assessment for liveness detection can protect the system from fake biometrics, as well as why a multi-biometric system is more secure than a uni-biometric system.
Keywords: Image quality assessment, biometrics, security, attacks.

129-132
Full Text PDF
12. Study of Different Brain Tumor MRI Image Segmentation Techniques
Ruchi D. Deshmukh, Chaya Jadhav
Abstract

Brain tumor segmentation is the separation of tumor areas from Magnetic Resonance (MR) images. A number of methods have already been presented for efficient segmentation of brain tumors; however, it is still difficult to identify a brain tumor from MR images. The segmentation process extracts the different tumor tissues, such as active tumor, necrosis, and edema, from the normal brain tissues, such as white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF). Per the survey, brain tumors are most often detected easily from brain MR images, but the required level of accuracy, reproducibility of segmentation, and classification of abnormalities are neither predictable nor straightforward. Segmentation of a brain tumor comprises many stages. Manual segmentation of brain MR images is a very time-consuming and tedious task and is associated with many challenges; therefore, automated segmentation methods for brain images are needed. Many techniques have been presented to investigate the performance of automated, computerized brain tumor detection for medical analysis. In this review paper, our main goal is to present a review of different brain tumor segmentation methods for MR images; the various methods are studied along with their advantages and disadvantages.

Index Terms - Brain Tumor, Classification, Disease Identification, Magnetic Resonance Imaging (MRI), Segmentation, Tumor Detection.

133-136
Full Text PDF
13. Secure SMS System for E-Commerce Applications
Mahmood Khalel Ibrahim, Wasan Zaki Ameen
Abstract

In this paper, a simple e-commerce shopping application is presented that implements a new version of the Diffie-Hellman protocol to ensure mutual authentication between client and server, along with secure key exchange over an insecure channel.
The proposed system provides the major security services of authenticity, integrity, and confidentiality by implementing Zero Knowledge Proof (ZKP) for authenticity, a keyed message authentication code algorithm (HMAC-SHA1) for integrity, and the Advanced Encryption Standard (AES) for confidentiality.

Keywords—Authenticity, Integrity, Confidentiality, Zero Knowledge Protocol, Diffie-Hellman, AES, HMAC.

137-142
Full Text PDF
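Of the three services the abstract names, the integrity check via HMAC-SHA1 can be shown with the Python standard library alone (the function names are hypothetical; AES confidentiality would require a third-party library such as pycryptodome and is omitted here):

```python
import hmac
import hashlib

def tag_message(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA1 tag over the message for integrity checking."""
    return hmac.new(key, message, hashlib.sha1).digest()

def verify_message(key: bytes, message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time, which avoids
    leaking tag bytes through timing side channels."""
    return hmac.compare_digest(tag_message(key, message), tag)
```

The shared key would come from the paper's Diffie-Hellman exchange; any modification of the message in transit makes verification fail.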