Collaborating Authors

 Chechulin, Andrey


Detecting AutoEncoder is Enough to Catch LDM Generated Images

arXiv.org Artificial Intelligence

In recent years, diffusion models have become one of the main methods for generating images. However, detecting images generated by these models remains a challenging task. This paper proposes a novel method for detecting images generated by Latent Diffusion Models (LDM) by identifying artifacts introduced by their autoencoders. By training a detector to distinguish between real images and those reconstructed by the LDM autoencoder, the method enables detection of generated images without directly training on them. The novelty of this research lies in the fact that, unlike similar approaches, this method does not require training on synthesized data, significantly reducing computational costs and enhancing generalization ability. Experimental results show high detection accuracy with minimal false positives, making this approach a promising tool for combating fake images.
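The core idea above — that an LDM autoencoder leaves a recoverable fingerprint, so a detector can be built from real images and their autoencoder reconstructions alone — can be sketched in miniature. This is an illustrative toy only: the paper trains a learned detector on real vs. reconstructed images, whereas the sketch below substitutes a simple threshold on reconstruction residual energy, and a 2x average-pool/upsample stand-in plays the role of the real LDM autoencoder (a real pipeline would use something like diffusers' `AutoencoderKL`).

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_autoencoder(img):
    # Stand-in for the LDM autoencoder (assumption: not the real model):
    # 2x average-pool then nearest-neighbour upsample, which discards
    # high-frequency detail much like a lossy VAE reconstruction does.
    h, w = img.shape
    pooled = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(pooled, 2, axis=0), 2, axis=1)

def residual_energy(img):
    # Detection feature: how much an image changes when re-encoded.
    # Images already produced by the autoencoder change very little;
    # real images lose their high frequencies and change a lot.
    return float(np.mean((img - toy_autoencoder(img)) ** 2))

# "Real" images: random textures with high-frequency content.
real = [rng.random((32, 32)) for _ in range(50)]
# "Generated" images: reconstructions, i.e. samples from the
# autoencoder's output distribution -- no diffusion sampling needed.
fake = [toy_autoencoder(x) for x in real]

scores_real = [residual_energy(x) for x in real]
scores_fake = [residual_energy(x) for x in fake]
# Pick a threshold between the two score populations.
threshold = (min(scores_real) + max(scores_fake)) / 2

def is_generated(img):
    return residual_energy(img) < threshold
```

Note the design point the abstract emphasises: the "fake" training set here is built entirely by passing real images through the autoencoder, so no diffusion sampling (and no training on synthesized images) is required.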


Towards Blockchain-based Multi-Agent Robotic Systems: Analysis, Classification and Applications

arXiv.org Artificial Intelligence

This is known as cloud computing, distributed planning and management, and provides an optimistic outlook towards increasingly popular technological solutions such as the Internet of Robotic Things (IoRT) [1], [2], [3], [4], [5] and Blockchain-based Multi-Agent Robotic Systems (MARS) [6], [7], [8], [9]. It is known that one of the important problems in developing multi-robot systems is the design of coordination strategies such that the robots can effectively perform their operations and reasonably allocate tasks among themselves [10]. The classical Blockchain Trilemma holds that a distributed ledger can offer only two of the three properties of decentralization, scalability and security [12]. One of the scaling methods that does not compromise security or decentralization is sharding, which involves fragmenting the available dataset into smaller datasets called shards [11], [12]. Although multi-agent robotic systems (MARS) are not as critical with respect to scalability and speed as financial and big-data systems, they are nevertheless very sensitive to delays and to the throughput of the information channels used for data exchange between agents. Real-world scenarios ...
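The sharding idea mentioned above — fragmenting one dataset into smaller shards — can be sketched with a deterministic hash-based assignment. This is a minimal illustration, not the scheme of any particular ledger; real systems use more elaborate mechanisms such as consistent hashing or validator committee rotation.

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    # Deterministic shard assignment: hash the record key and reduce it
    # modulo the shard count, so every node agrees on where a record lives
    # without central coordination.
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

# Fragment a dataset of records into 4 shards.
NUM_SHARDS = 4
records = [f"tx-{i}" for i in range(10)]  # hypothetical transaction ids
shards = {s: [] for s in range(NUM_SHARDS)}
for r in records:
    shards[shard_for(r, NUM_SHARDS)].append(r)
```

Each record lands in exactly one shard, so each shard can be validated and stored by a subset of agents — this is how sharding gains throughput without weakening decentralization or security for the whole ledger.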