Explaining Attention Consistency for LLMs