In this talk I introduce Digital Mental Models (DMMs) as a novel cognitive capability of AI-powered, cognition-enabled robots. By combining digital twin technology with symbolic knowledge representation and embodying this combination in robots, we tackle the challenge of converting vague task requests into specific robot actions, that is, robot motions that cause the desired physical effects while avoiding unwanted side effects. This capability enables robots to perform everyday manipulation tasks with an unprecedented level of context-sensitivity, foresight, generality, and transferability. DMMs narrow the cognitive divide that currently exists in robotics by equipping robots with a profound understanding of the physical world and how it works.