FASTMORA

How we exposed gender biases at Davos

Task

The shortcoming of the AI economy is that, since its inception, computer programming has been done predominantly by men. This has resulted in systems worldwide that are embedded with gender and cultural biases, yet which organisations rely on for sensitive screening and decision-making. Our task: make the case and create demand for a new global consulting practice aimed at the C-Suite.

Pilot

FastMora used a triangulation approach, bringing stakeholders, AI experts, and social scientists together in a sprint session. We built models and created future scenarios to design a real-time demonstration unveiled at the World Economic Forum in Davos. The experiment included crowdsourced AI frameworks built with the same gender and ethnic imbalances found in Silicon Valley. The internal and external discussions the experiment sparked gave rise to a new initiative, backed by our client’s leadership and endorsed by their portfolio of customers.

Transformation

The pilot gathered sufficient data and interest to greenlight a new practice category. One year later, in 2018, our client unveiled a consulting offer that provides diagnostic tools for addressing these systemic AI biases; the offer has since been embedded into their main product.