CMC-Computers, Materials & Continua, Vol. 58, No. 1, 2019, pp. 1-13, DOI: 10.32604/cmc.2019.02171
OPEN ACCESS
ARTICLE
High Capacity Data Hiding in Encrypted Image Based on Compressive Sensing for Nonequivalent Resources
Di Xiao1,*, Jia Liang1, Qingqing Ma1, Yanping Xiang1, Yushu Zhang2
1 College of Computer Science, Chongqing University, Chongqing, 400044, China.
2 School of Information Technology, Deakin University, Victoria 3125, Australia.
* Corresponding Author: Di Xiao. Email: xiaodi_cqu@hotmail.com.
Abstract
To meet data-security requirements in environments with nonequivalent resources, a high-capacity data hiding scheme for encrypted images based on compressive sensing (CS) is proposed, fully exploiting the adaptability of CS to nonequivalent resources. The original image is divided into two parts: one part is encrypted with a traditional stream cipher; the other part is converted to its prediction error and then encrypted with CS, which simultaneously vacates room for embedding. The collected non-image data is first encrypted with a simple stream cipher. For data-security management, the encrypted non-image data is then embedded into the encrypted image, and a scrambling operation further improves security. Finally, the original image and the non-image data can be separably recovered and extracted on request by valid users with different access rights. Experimental results demonstrate that the proposed scheme outperforms other CS-based data hiding methods and is better suited to nonequivalent resources.
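The pipeline summarized above can be illustrated with a minimal toy sketch. All concrete choices below (block size, horizontal predictor, Gaussian sensing matrix, measurement count) are assumptions for illustration only; the paper's actual predictor, measurement matrix, embedding, and key management are described in the body of the article.

```python
import numpy as np

rng = np.random.default_rng(0)  # in practice, secret keys would seed these steps

# Toy 8x8 "image" block standing in for the original image
img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)

# Part 1: traditional stream-cipher encryption (XOR with a keystream)
keystream = rng.integers(0, 256, size=img.shape, dtype=np.uint8)
enc_part1 = img ^ keystream

# Part 2: compute a prediction error (simple horizontal predictor, an
# assumption), then take CS measurements of it
pred_err = img.astype(np.int16)
pred_err[:, 1:] -= img[:, :-1].astype(np.int16)
x = pred_err.flatten().astype(float)

m = 32                                               # m < n measurements
phi = rng.standard_normal((m, x.size)) / np.sqrt(m)  # Gaussian sensing matrix
y = phi @ x  # compressed, key-dependent measurements; n - m values of
             # room are vacated for embedding the encrypted non-image data

# Separable recovery of part 1: XOR again with the same keystream
assert np.array_equal(enc_part1 ^ keystream, img)
```

Because the sensing matrix is generated from a key, the measurements `y` serve both as an encryption of the prediction-error part and as a compressed representation, which is what vacates room for the hidden data.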
Keywords
Compressive sensing, encrypted image, data hiding, prediction error, nonequivalent resources.
Cite This Article
Xiao, D., Liang, J., Ma, Q., Xiang, Y., Zhang, Y. (2019). High Capacity Data Hiding in Encrypted Image Based on Compressive Sensing for Nonequivalent Resources. CMC-Computers, Materials & Continua, 58(1), 1–13.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.