Optimizing CNN-BiGRU performance: Mish activation and comparative analysis with ReLU
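Since the title contrasts the Mish and ReLU activations, a minimal sketch of the two functions may help orient the reader. Mish is defined as x * tanh(softplus(x)); ReLU is max(0, x). The plain-Python implementation below is illustrative only and is not taken from the paper's own code.

```python
import math

def softplus(x: float) -> float:
    # softplus(x) = ln(1 + e^x), computed stably for large |x|
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x: float) -> float:
    # Mish: smooth, non-monotonic, allows small negative outputs
    return x * math.tanh(softplus(x))

def relu(x: float) -> float:
    # ReLU: zero for negative inputs, identity for positive inputs
    return max(0.0, x)
```

Unlike ReLU, Mish is differentiable everywhere and passes a small negative signal for negative inputs, which is the usual motivation for comparing the two in CNN-based architectures.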