The Chetwoods Works team champions the digital delivery of our projects. Dr. Erika Parn, a Research Associate at the University of Cambridge and lead strategic consultant to Works, has been looking at how our design practice and creative processes could be aided by the recently launched beta version of DALL-E 2, a platform from OpenAI that represents the latest breakthrough in natural language and image processing using artificial intelligence.
DALL-E 2 demonstrates the power of using AI to interpret and generate complex content, including visualisations. It takes written descriptions in English and creates corresponding images in any style, such as a painting, crayon drawing, sketch, 3D render or digital photograph.
As architects, we find the potential of this technology to visualise a building design from a written description certainly interesting. From a written description, DALL-E 2 can automatically generate 3D visualisations, sketches and photorealistic images within 10 seconds, and thousands of design variations can then be created.
While this technology will never replace an architect, DALL-E 2 could be used to generate conceptual images of buildings that do not yet exist. This would make it a valuable additional tool during the concept design phases of a project, offering high-quality ‘hero image’ visuals that can quickly be adapted ‘live’ in response to discussions and conversations as the design solution evolves.
The technology has the potential to add a further visual communication dimension to the existing stages in our creative process: our cross-disciplinary design teams researching and interpreting the detail of a brief, exploring design criteria and parameters, assessing the geography and physical aspects of a site and its local environment, and looking at how to maximise environmental features and user-wellbeing benefits.
How DALL–E 2 works
Both of the images above were generated by DALL-E 2. How exactly?
DALL-E 2 uses artificial neural networks, with over 12 billion parameters, to make sense of natural language inputs and convert them into visual outputs: its interface takes a written description and generates corresponding images.
All six images below were generated within 10 seconds from the text input: “3D render of a camouflaged cube-shaped house in a serene green forest”. Interestingly, the technology has interpreted the word ‘camouflaged’ slightly differently to how a designer might: it has embedded the building within the forest itself, rather than adapting the facade colours or elements of the structure to match the surroundings.
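For readers curious how such an experiment translates into practice, a sketch of a request to OpenAI's public Images API is shown below. The parameter names (`prompt`, `n`, `size`) reflect the API at the time of writing and may change; the helper function is our own illustrative wrapper, not part of the SDK.

```python
# Sketch of how a text prompt might be sent to OpenAI's Images API to
# reproduce the experiment above. Parameters reflect the public API at
# the time of writing; treat them as indicative, not definitive.

def build_image_request(prompt: str, n: int = 6, size: str = "1024x1024") -> dict:
    """Assemble the request body for a text-to-image generation call."""
    if not 1 <= n <= 10:
        raise ValueError("the API accepts between 1 and 10 images per request")
    return {"prompt": prompt, "n": n, "size": size}

request = build_image_request(
    "3D render of a camouflaged cube-shaped house in a serene green forest"
)

# With the official SDK installed and an API key configured, the call
# would look roughly like:
#
#   import openai
#   response = openai.Image.create(**request)
#   urls = [item["url"] for item in response["data"]]
```

Each returned item is a URL to a generated image; asking for six variations at once mirrors the grid of results the web interface presents.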
The technology also allows these generated images to be edited: you simply select an area of the image to replace and describe the missing visual element in the description box.
We decided to add a herd of deer to our render:
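Under the hood, this ‘erase and describe’ edit corresponds to the API's image-edit endpoint, which takes the original image, a mask marking the erased region, and a text prompt for what should fill it. The sketch below is indicative only; the file names and the helper function are our own placeholders.

```python
# Illustrative sketch of the masked-edit call behind the feature
# described above. Field names follow the public Images edit endpoint
# at the time of writing; the helper is a placeholder of our own.

def build_edit_request(prompt: str, n: int = 1, size: str = "1024x1024") -> dict:
    """Assemble the non-file fields of an image-edit request."""
    return {"prompt": prompt, "n": n, "size": size}

fields = build_edit_request("a herd of deer grazing in the forest clearing")

# With the official SDK the call would look roughly like:
#
#   import openai
#   response = openai.Image.create_edit(
#       image=open("render.png", "rb"),   # the original render
#       mask=open("mask.png", "rb"),      # transparent where deer should appear
#       **fields,
#   )
```

The mask is simply a copy of the image with the selected region made transparent; the model regenerates only that region to match the prompt.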
The possibilities raised by DALL-E 2 are considerable and it will be interesting to see how it develops and what new applications are found for it. From our perspective as architects, while it can currently only aid part of our overall design process, it could be an additional tool in the repertoire of new technologies we are already exploring and developing.
We have been testing new ways of designing, using current circular design principles combined with our research into wellbeing and emotional response to places and spaces.
DALL-E 2 is not the only AI tool we have been exploring in our design work. Our data-centric approach to the design and use of buildings includes trialling facial expression recognition tools which measure human emotional responses to design. This is to analyse and understand how combining art, architecture and landscape in our design work can promote happiness and wellbeing.
The technology we have been testing on live projects uses computer vision to recognise, interpret and process human emotion based on facial expressions and eye-gaze fixation. The programme analyses even the smallest facial movements to deliver data on emotional responses to a space (e.g. expressions of joy, surprise or disgust).
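The final step such a tool performs can be pictured as reducing per-expression scores from the vision model to a single dominant emotion label. The sketch below is purely illustrative: the emotion names, threshold and scoring scheme are hypothetical placeholders, not the vendor's actual model.

```python
# Purely illustrative: collapse facial-expression scores (e.g. from a
# computer-vision model) into one dominant emotion label. The labels
# and the 0.5 confidence threshold are hypothetical placeholders.

EMOTIONS = ("joy", "surprise", "disgust")

def dominant_emotion(scores: dict) -> str:
    """Return the highest-scoring emotion, or 'neutral' below threshold."""
    best = max(EMOTIONS, key=lambda e: scores.get(e, 0.0))
    return best if scores.get(best, 0.0) > 0.5 else "neutral"

print(dominant_emotion({"joy": 0.82, "surprise": 0.10}))  # → joy
```

Aggregated over time and across visitors, labels like these are what yields the emotional-response data for a space mentioned above.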
This human emotional response to design research programme has been recognised in a shortlisting for the AJ100 Innovation of the Year Award 2022 and within the Digital Innovation in Design category at the Digital Construction Awards 2022.
In all our research programmes our Works technology experts work closely with our Studio team, a group of our most experimental architectural designers who bring imagination and emotion to our designs, and Thrive, who specialise in environmental issues and health and wellbeing. The three elements of the Chetwoods brand work together to advise and support our project teams at every stage of the design and delivery of a project.