Transmitting raw binary data across a network should generally be avoided for several reasons: the transmission medium may be textual and unable to accept or correctly handle a raw bitstream, and some protocols may misinterpret the meaning of the bits, causing corruption or even loss of the data. To make the data more portable and to avoid misinterpretation by different systems and environments, this paper introduces encoding two of the most widely used data interchange formats, XML and JSON, into Base64, an encoding scheme that converts binary data to an ASCII string format using a radix-64 representation. The results show that encoding data in Base64 before transmission offers several advantages, including readability and integrity, and that it enables binary data to be transmitted over textual media, 7-bit protocols such as SMTP, and heterogeneous network hardware without risking misinterpretation.
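As a rough illustration of the idea (a minimal sketch, not the paper's implementation; the payload and field names are invented for the example), the following Python snippet serializes a JSON object, Base64-encodes it for transmission over a text-only channel, and decodes it on the receiving side:

```python
# Illustrative sketch (not the paper's implementation): Base64-encoding a JSON
# payload before transmission over a text-only / 7-bit channel, then decoding it.
import base64
import json

payload = {"sensor": "temp-01", "reading": 23.5}          # example data, assumed

# Serialize to JSON, then encode the UTF-8 bytes as Base64 ASCII text.
json_bytes = json.dumps(payload).encode("utf-8")
b64_text = base64.b64encode(json_bytes).decode("ascii")   # safe for SMTP-style media

# Receiver side: reverse the steps to recover the original structure.
recovered = json.loads(base64.b64decode(b64_text).decode("utf-8"))
assert recovered == payload
print(b64_text)
```

The same two-step pattern (serialize, then Base64-encode) applies unchanged to an XML document, since Base64 operates on the byte stream regardless of the underlying interchange format.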
Nowadays, it is difficult to imagine a strong cryptographic algorithm withstanding sustained cryptanalysis and attack without the use of unconventional techniques. Although some substitution algorithms are old, such as the Vigenère, Alberti, and Trithemius ciphers, they are still considered strong and effectively unbreakable. In this paper we present a novel algorithm that uses biological computation as an unconventional search tool combined with an unconstrained analysis method, the vertical probabilistic model; together they make attacking and analyzing these ciphers feasible and transform the problem from a complex one into a linear one, which is a novel achievement. The letters of the encoded message are processed as segments of equal length in order to report the available hardware components. Each letter codon represents a region of the memory strand, and the letters computed for it are symbolized within the probabilistic model so that each pair has a triple encoding: the first is given as a memory-strand encoding and the others are its complement in the sticker encoding; these encodings differ from one region to another. The solution space is computed and then the parallel search process begins. Some memory complexes are excluded even though they lie on the solution paths formed, because natural language does not contain their sequences. The precision of the solution and the time needed to reach it depend on the length of the processed text, and the precision of the solution is often inversely proportional to the speed of reaching it. On average, a text of 200 cipher characters needs approximately 15 minutes to yield 98% of the correct components of the specific hardware. The aim of the paper is to transform OTP substitution analysis from an NP problem into an O(nm) problem, which makes it easier to find solutions with the available capabilities and to develop methods that can be harnessed to attack difficult and strong ciphers that differ in class and type from OTP polyalphabetic substitution ciphers.
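The paper's DNA sticker-model search cannot be reproduced here, but the underlying idea of analyzing the ciphertext in equal-length segments, column by column, can be illustrated with a conventional sketch. The snippet below is NOT the proposed biological algorithm; it shows the classical column-wise ("vertical") frequency analysis of a Vigenère ciphertext under an assumed key length, with the reference frequency table and helper names chosen for the example:

```python
# Conventional illustration only: column-wise ("vertical") frequency analysis of a
# Vigenère ciphertext with a known/assumed key length. This is NOT the paper's
# DNA sticker-model algorithm; it only illustrates the segment-by-segment idea.
from collections import Counter

# Approximate English letter frequencies (A..Z), a standard reference table.
ENGLISH_FREQ = [
    0.082, 0.015, 0.028, 0.043, 0.127, 0.022, 0.020, 0.061, 0.070, 0.002,
    0.008, 0.040, 0.024, 0.067, 0.075, 0.019, 0.001, 0.060, 0.063, 0.091,
    0.028, 0.010, 0.024, 0.002, 0.020, 0.001,
]

def best_shift(column: str) -> int:
    """Return the Caesar shift whose decryption best matches English frequencies."""
    scores = []
    for shift in range(26):
        counts = Counter((ord(c) - ord('A') - shift) % 26 for c in column)
        total = len(column)
        # Higher dot product with the reference profile = more English-like column.
        scores.append(sum(ENGLISH_FREQ[i] * counts.get(i, 0) / total for i in range(26)))
    return max(range(26), key=lambda s: scores[s])

def recover_key(ciphertext: str, key_length: int) -> str:
    text = "".join(c for c in ciphertext.upper() if c.isalpha())
    columns = [text[i::key_length] for i in range(key_length)]   # equal-length segments
    return "".join(chr(ord('A') + best_shift(col)) for col in columns)
```

The proposed method replaces this per-column statistical scoring with a massively parallel sticker-model search over candidate encodings, which is where the claimed reduction from an NP search to an O(nm) procedure comes from.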
Experts and researchers in the field of information security have placed a high value on the security of image data in recent years and have presented several increasingly secure image encryption techniques. To raise the security level of image encryption algorithms, this article offers an efficient diffusion approach for image encryption based on the one-dimensional Logistic map, the three-dimensional Lorenz system, DNA encoding and computing, and SHA-256. Encryption tests demonstrate that the method has high security and reliability. The article also examines the security of the encryption method through secret-key space analysis, a key sensitivity test, histogram analysis, information entropy measurement, correlation examination, and differential-attack analysis. When the image encryption method described in this article is compared to several previous image encryption techniques, the proposed algorithm achieves higher information entropy and a lower correlation coefficient.
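To give a flavor of the diffusion stage, here is a deliberately simplified sketch (not the article's full scheme, which additionally uses the 3-D Lorenz system and DNA encoding/computing; the function names and seed-derivation details are assumptions for the example): a logistic-map keystream is seeded from the SHA-256 hash of the plain image and XORed with the pixel bytes:

```python
# Simplified sketch of the diffusion idea only: a logistic-map keystream seeded from
# the SHA-256 hash of the plain image, XORed with the pixel bytes. The article's full
# scheme additionally uses the 3-D Lorenz system and DNA encoding/computing.
import hashlib

import numpy as np

def logistic_keystream(seed: float, length: int, r: float = 3.99) -> np.ndarray:
    """Generate `length` pseudo-random bytes from the logistic map x -> r*x*(1-x)."""
    x = seed
    out = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def diffuse(image: np.ndarray) -> np.ndarray:
    """XOR-diffuse a uint8 grayscale image with a keystream derived from its SHA-256 hash."""
    digest = hashlib.sha256(image.tobytes()).digest()
    # Map the first 8 hash bytes to an initial value in (0, 1) for the logistic map.
    seed = (int.from_bytes(digest[:8], "big") % (2**53)) / float(2**53) or 0.5
    stream = logistic_keystream(seed, image.size)
    return (image.reshape(-1) ^ stream).reshape(image.shape)
```

Because the keystream depends on the hash of the plaintext image, this kind of construction is what makes such schemes sensitive to single-pixel changes, which is what the differential-attack metrics measure.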
To date, an approach in which a coder with a combined structure compresses images by combining several different coders, together with the system for connecting them to various bit planes and the control system for these connections, has not been studied. There is therefore a need to develop such a structure and to study the effectiveness of a combined codec for lossless compression of images of various types in the spatial domain, based on arithmetic coding and Run-Length Encoding (RLE) algorithms. The essence of separate effective coding is to use independent coders of the same type, or one coder connected to the planes alternately, in order to compress the higher and lower bit planes of the image or their combinations. In this paper, the results of studying the effectiveness of combining arithmetic and RLE coding for several types of images are presented. As a result of developing this structure, the effectiveness of combined coding for compressing the differences in the channels of hyperspectral images (HSI) has been established; hyperspectral images consist of many spectral bands, rather than the typical three bands (RGB or YCbCr) found in regular images, and each pixel represents the entire spectrum of light reflected by the object or scene at that location.
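To make the bit-plane idea concrete, the following is a minimal sketch (with invented helper names; not the paper's codec): an 8-bit image is split into its bit planes and each plane is run-length encoded, which corresponds to the branch of the combined structure suited to planes with long uniform runs; the arithmetic-coding branch and the plane-to-coder control logic are not shown.

```python
# Illustrative sketch only: splitting an 8-bit image into bit planes and run-length
# encoding each plane. The arithmetic-coding branch and the connection/control logic
# of the combined codec described in the paper are omitted here.
import numpy as np

def bit_planes(image: np.ndarray) -> list:
    """Return the 8 bit planes of a uint8 image, from LSB (index 0) to MSB (index 7)."""
    return [(image >> k) & 1 for k in range(8)]

def rle_encode(plane: np.ndarray) -> list:
    """Run-length encode a flattened binary plane as (bit value, run length) pairs."""
    flat = plane.reshape(-1)
    runs = []
    current, count = int(flat[0]), 1
    for bit in flat[1:]:
        if int(bit) == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = int(bit), 1
    runs.append((current, count))
    return runs

# Usage idea: higher (more significant) planes of smooth images contain long runs,
# so RLE suits them, while noisier lower planes would go to the arithmetic coder.
image = np.arange(64, dtype=np.uint8).reshape(8, 8)        # toy example image
encoded_planes = [rle_encode(p) for p in bit_planes(image)]
```

In the combined codec studied in the paper, the choice of which coder handles which plane (or combination of planes) is what the connection and control system decides; the sketch above only shows the RLE side of that decision.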