One of the biggest open problems in mathematics may be solved within the next decade, according to a poll of computer scientists. A solution to the so-called P versus NP problem carries a $1 million prize and could have a profound effect on computing, and perhaps even the entire world. The problem asks how long algorithms take to run and whether some hard mathematical problems are actually easy to solve. P and NP are both classes of mathematical problems, but it is not known whether the two classes are identical.
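The gap between the two classes can be illustrated with subset sum, a standard NP problem: *verifying* a proposed solution takes linear time, while the only solvers known in general take exponential time. A minimal sketch (function names and numbers are illustrative, not from the article):

```python
from itertools import combinations

def verify_certificate(nums, target, indices):
    # The NP side: checking a proposed certificate is fast,
    # O(len(indices)) time.
    return sum(nums[i] for i in indices) == target

def solve_by_search(nums, target):
    # The "is it in P?" side: no polynomial-time algorithm is known,
    # so this brute force tries up to 2^n candidate subsets.
    n = len(nums)
    for r in range(n + 1):
        for idx in combinations(range(n), r):
            if verify_certificate(nums, target, idx):
                return idx
    return None

# Verifying is trivial; finding required searching.
print(verify_certificate([3, 7, 1, 8], 11, (0, 3)))  # True: 3 + 8 == 11
print(solve_by_search([3, 7, 1, 8], 11))             # (0, 3)
```

If P = NP, every problem whose solutions can be checked this quickly could also be *solved* comparably quickly.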

Hello, I have a problem and some variations on it that I would like to share. At the moment, I don't know solutions to all of them, so any ideas/solutions are welcome in the comments.

Problem 0: You are given an array $$$A$$$ of $$$n$$$ integers and $$$q$$$ queries of the form "$$$l$$$ $$$r$$$ $$$x$$$", where each number in $$$A[l...r]$$$ appears either once or twice.

Problem 1: The same as Problem 0, but without the restriction that each number in $$$A[l...r]$$$ appears either once or twice.

Problem 4: The same as Problem 1, but we have updates that change single elements (e.g.
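The statement above does not say what a "$$$l$$$ $$$r$$$ $$$x$$$" query should return, so purely for illustration assume each query asks how many times $$$x$$$ occurs in $$$A[l...r]$$$ (a hypothetical reading, not the author's). Without updates, a sorted position list per value answers each query in $$$O(\log n)$$$:

```python
from bisect import bisect_left, bisect_right
from collections import defaultdict

def build_positions(A):
    # For every value, the sorted list of indices where it occurs.
    pos = defaultdict(list)
    for i, v in enumerate(A):
        pos[v].append(i)
    return pos

def count_in_range(pos, l, r, x):
    # Hypothetical query: occurrences of x in A[l..r]
    # (0-based, inclusive), O(log n) via binary search.
    p = pos.get(x, [])
    return bisect_right(p, r) - bisect_left(p, l)

A = [1, 2, 1, 3, 2, 1]
pos = build_positions(A)
print(count_in_range(pos, 0, 5, 1))  # 3
print(count_in_range(pos, 1, 3, 2))  # 1
```

For the Problem 4 variant with point updates, the static sorted lists would need to be replaced by a structure supporting insertion and deletion per value (e.g. a balanced BST or an order-statistics structure).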

The open-domain Frame Problem is the problem of determining what features of an open task environment need to be updated following an action. Here we prove that the open-domain Frame Problem is equivalent to the Halting Problem and is therefore undecidable. We discuss two other open-domain problems closely related to the Frame Problem, the system identification problem and the symbol-grounding problem, and show that they are similarly undecidable. We then reformulate the Frame Problem as a quantum decision problem, and show that it is undecidable by any finite quantum computer.
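The Halting Problem that the reduction targets is classically shown undecidable by diagonalization: any total decider can be defeated by a program that consults the decider about itself and then does the opposite. A toy sketch of that argument (the `naive_halts` decider and the simulated loop are invented for illustration):

```python
def naive_halts(func, arg):
    # A candidate total halting decider. Any concrete candidate will
    # do for the demonstration; this one always answers True.
    return True

def diagonal(func):
    # By construction, diagonal(f) "loops" exactly when the decider
    # claims f(f) halts, and halts when the decider claims it loops.
    if naive_halts(func, func):
        return "looping forever (simulated)"  # stand-in for a real infinite loop
    return "halted"

# The decider claims diagonal(diagonal) halts, yet by construction
# diagonal(diagonal) would then loop: the decider is wrong on this input.
print(naive_halts(diagonal, diagonal))  # True
print(diagonal(diagonal))               # "looping forever (simulated)"
```

The same diagonal trap applies to every candidate decider, which is why reducing the open-domain Frame Problem to halting establishes its undecidability.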

Data integrity is the property that a world state, interpreted with a world model, is consistent with the real operating environment. Even a formally verified safety claim of an autonomous system is prone to malfunction caused by loss of data integrity. From a first-person viewpoint in a congested environment, some components of the measurable part of the world state may become transiently deficient or unavailable because of the limited capability of sensor devices. If the system gets into a situation where the world state suddenly becomes unobservable, existing estimation methods may become unstable. These methods can hardly detect the loss of data integrity and may produce an incorrect estimate without any notice. Our insight is that the original concept of observer theory can be merged with that of automated reasoning. First, we propose a way of unifying them into a problem of checking the satisfiability of a formula consisting of predicates over the world model and decision variables over the unmeasurable part of the world state; a loss of data integrity is detected when the problem is unsatisfiable. Second, we replace the notion of observability in control theory with identifiability with respect to a measure of tolerance and a world model, and give a procedure for estimating the world state with a bounded uncertainty specified by the measure of tolerance. Third, we show that a sensor-fusion problem, a problem of inferring a world state of discrete, enumerated type, and a decision problem under uncertainty in the world state can all be formulated as identifiability problems. The proposal provides a constructive basis for supporting the degree of confidence in an estimated world state.
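The first step, detecting loss of integrity via satisfiability, can be sketched with a toy discrete world model (the model, variable names, and numbers below are all invented for illustration, and an enumerative check stands in for a real SAT/SMT solver): the measured part of the state fixes some variables, the unmeasurable part is existentially quantified, and integrity is lost exactly when no assignment satisfies the model.

```python
from itertools import product

def integrity_ok(world_model, measured, unmeasured_domains):
    # True iff some assignment to the unmeasured variables satisfies
    # the world-model predicate (SAT); False signals loss of data
    # integrity (UNSAT). Brute-force enumeration over finite domains.
    names = list(unmeasured_domains)
    for values in product(*(unmeasured_domains[n] for n in names)):
        state = dict(measured, **dict(zip(names, values)))
        if world_model(state):
            return True
    return False

# Toy world model: the (unmeasured) lane index must be consistent
# with the measured lateral offset, with lanes 3.5 m wide and a
# tolerance of 1.0 m.
def model(s):
    return abs(s["offset"] - 3.5 * s["lane"]) <= 1.0

ok  = integrity_ok(model, {"offset": 3.6}, {"lane": [0, 1, 2]})  # lane=1 fits -> True
bad = integrity_ok(model, {"offset": 9.9}, {"lane": [0, 1, 2]})  # no lane fits -> False
```

An unsatisfiable instance (`bad`) flags that the measurement cannot be reconciled with any completion of the world state, which is exactly the silent failure mode the abstract says existing estimators miss.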