The PS5's dashboard will have 'a whole new visual language'


The PlayStation 5 was finally unveiled last week and whatever you make of the design, we can probably agree that it's something of a departure from the console's previous iterations. Now, PlayStation's head of UX design has said we can expect the same for its user interface. A new interface was always on the cards -- something that was confirmed last week after we were given a quick demonstration of the start-up screen. But in a LinkedIn thread, Matt MacLaurin said the team has created a "100 percent overhaul of the PS4 UI and some very different new concepts." He added that the PS5 OS is "more subtle than flashy, but no pixel is untouched," and that as a UI "it's practical first, but it's a whole new visual language and a complete rearchitecting of the user interface."


After three years of research aimed at popularizing natural language interfaces, we now wish to give developers greater freedom to create their own language-control interfaces.

Conversation: The New User Interface - Chatamo


While on the face of it, creating a chatbot capable of conversation seems straightforward, chatbots do in fact require a lot of careful consideration. It's all too easy to confuse, annoy or bore the user.

A Natural Language User Interface is just a User Interface


Let's say you're writing an application, and you want to give it a conversational interface: your users will type some command, and your application will do something in response, possibly after asking for clarification. There are lots of terms associated with this technology -- conversational commerce, bots, AI agents, etc. I think it's much clearer to call it a Linguistic User Interface (LUI), by analogy with the Graphical User Interface (GUI) you could attach to the same application. Imagining your application with a GUI is a good antidote to potentially woolly thinking about "AI agents". You still need to wire the UI to the underlying application, and the conceptual model of your underlying application is still going to play a dominant role in the overall user experience.
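To make the "LUI is just a UI" point concrete, here is a minimal sketch in Python. The application core (a hypothetical to-do list, with made-up names like `create_todo` and `handle_utterance`) is plain functions; the linguistic front end is a thin command parser wired onto those same functions, exactly as a GUI's button handlers would be, and it asks for clarification when it can't parse the input.

```python
def create_todo(todos, text):
    """Application logic: add a task and return its index."""
    todos.append(text)
    return len(todos) - 1

def complete_todo(todos, index):
    """Application logic: mark a task done by removing it."""
    return todos.pop(index)

def handle_utterance(todos, utterance):
    """Linguistic UI layer: map a typed command onto the same core calls
    a GUI button handler would make. Unrecognized input triggers a
    clarifying question rather than a guess."""
    words = utterance.lower().split()
    if words[:1] == ["add"] and words[1:]:
        task = " ".join(words[1:])
        create_todo(todos, task)
        return f"Added '{task}'."
    if words[:1] == ["done"] and words[1:] and words[1].isdigit():
        task = complete_todo(todos, int(words[1]))
        return f"Completed '{task}'."
    return "Sorry, did you mean 'add <task>' or 'done <number>'?"

todos = []
print(handle_utterance(todos, "add buy milk"))  # Added 'buy milk'.
print(handle_utterance(todos, "done 0"))        # Completed 'buy milk'.
```

A real system would replace the keyword matching with intent classification, but the architecture stays the same: the conversational layer and a GUI are interchangeable front ends over one underlying conceptual model.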

The UX of Voice: The Invisible Interface


It's a brand new year, and by most reliable indicators – the latest demos at CES 2017, the buzz on all the tech blogs and even the pre-roll ads interrupting my binge watching of Crazy Ex-Girlfriend – it looks like 2017 will be the year that voice interaction reaches mainstream adoption. Voice interaction – the ability to speak to your devices and have them understand and act on whatever you're asking – was everywhere this year. Device manufacturers of all shapes and sizes heavily integrated voice capabilities into their offerings at CES 2017, with Amazon's Alexa stealing the show as the AI platform of choice. The rapid proliferation of voice interaction capabilities across our individual digital ecosystems raises critical questions for any designer whose work plays a role in the customer experience. It's becoming clear that voice interaction will soon become an expected offering, as either an alternative to, or even a full replacement for, traditional visual interfaces.