Senior Architect Charlie Brett took part in a panel talk at the Workspace Design Show, all about AI and sustainability in design. The panel explored whether AI is helping or hindering designers' and specifiers' creativity. How best do we combine human skill and technology to create beautifully designed workplaces that encourage productivity and wellness, while saving the planet? Charlie leads our research and development of AI-powered design and analytical tools, bringing valuable insight into how technology is shaping the industry. Here are his key learnings.
The Three Types of AI and Where We Actually Are
It helps to be clear about what we mean when we say “AI” in an architectural context, because the word is doing a lot of work across very different tools. My view is that AI in architecture currently falls into three broad categories: Large Language Models (LLMs) like Claude, ChatGPT, and Perplexity, which process and generate text and numerical data; Visual models like Midjourney, NanoBanana and Stable Diffusion, which generate images from text or image prompts to support visualisation; and Generative/spatial AI, which overlaps with parametric and algorithmic design to spatialise data in two and three dimensions.
Of the three, LLMs and visual models are the most mature and the most accessible to practice right now. Spatial AI is the most architecturally exciting in the long run, but it’s also the least ready. The biggest near-term opportunity – and the one I want to focus on – sits firmly in the first category.
The Case for Right-Sizing with Evidence
Our practice has been testing LLMs for space programming of offices: taking a client brief, aggregating survey data, and using a custom AI environment to generate and compare space programme outputs. We recently ran this process for a client team of roughly 200 staff, feeding in both the brief and a body of qualitative survey feedback, conversational and text-based responses about how their teams work. We then compared four outputs: two AI-generated (Claude and ChatGPT), alongside two produced by experienced human designers through conventional methods.
The results were striking. Assessed against a matrix of criteria including the client’s own design standards, BCO guidelines, benchmarking from comparable sites, WELL Building Standard requirements, and BREEAM criteria, the Claude output achieved the highest overall score. More interesting to me than who “won,” though, was what the comparison revealed about the nature of the task itself.
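The assessment described above can be pictured as a simple weighted scoring exercise. The sketch below is illustrative only: the criteria names mirror the standards mentioned in the text, but the weights and per-criterion scores are invented for demonstration, not the practice's actual assessment matrix.

```python
# Illustrative weighted-criteria comparison of space programme outputs.
# Criteria mirror those named in the text; weights and scores are hypothetical.
CRITERIA_WEIGHTS = {
    "client_design_standards": 0.30,
    "bco_guidelines": 0.20,
    "benchmarking": 0.15,
    "well_standard": 0.20,
    "breeam": 0.15,
}  # weights sum to 1.0

def weighted_score(scores):
    """Combine per-criterion scores (0-10) into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Hypothetical per-criterion scores for the four outputs compared.
outputs = {
    "Claude":     {"client_design_standards": 9, "bco_guidelines": 8,
                   "benchmarking": 8, "well_standard": 9, "breeam": 8},
    "ChatGPT":    {"client_design_standards": 8, "bco_guidelines": 8,
                   "benchmarking": 7, "well_standard": 8, "breeam": 8},
    "Designer A": {"client_design_standards": 9, "bco_guidelines": 7,
                   "benchmarking": 8, "well_standard": 7, "breeam": 8},
    "Designer B": {"client_design_standards": 7, "bco_guidelines": 9,
                   "benchmarking": 8, "well_standard": 7, "breeam": 7},
}
ranking = sorted(outputs, key=lambda name: weighted_score(outputs[name]), reverse=True)
print(ranking[0])  # highest-scoring output under these example weights
```

The useful property of this shape is that the weighting is explicit and arguable, so client priorities can be renegotiated without redoing the scoring itself.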
The variance between the largest and smallest programme was approximately 450 m², for the same client, the same staff number, and the same building. By our rough calculations, that gap, approximately 25% of the floor plate, represents roughly 368 tCO₂e over ten years of operation. That difference had nothing to do with ambition or quality of thinking, only methodology.
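The back-of-envelope arithmetic behind a figure like that is simply area times carbon intensity times time. The intensity below is an assumed illustrative value (roughly 82 kgCO₂e per m² per year), chosen only to show the shape of the calculation, not a published benchmark:

```python
def excess_carbon_tonnes(area_gap_m2, intensity_kgco2e_per_m2_yr, years):
    """Carbon attributable to an over-sized programme: area x intensity x time."""
    return area_gap_m2 * intensity_kgco2e_per_m2_yr * years / 1000  # kg -> tonnes

# 450 m2 programme gap, assumed ~81.8 kgCO2e/m2/yr intensity, 10 years of operation
print(round(excess_carbon_tonnes(450, 81.8, 10)))  # roughly 368 tCO2e
```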
This is, I’d argue, the single most important point in the AI and sustainability conversation for our sector. Buildings and construction are responsible for around 37% of global CO₂ emissions.¹ The closer we get to completed construction, the less impact we can have; at that stage we are down to specification swaps and VE exercises, which are useful but limited. The biggest lever we can pull is at the brief stage, before a line is drawn. But the key word here is not reduction, it is optimisation. Investing an extra 20 square metres in a genuinely flexible, well-serviced space at the outset might avoid the need for 400 square metres of reactive, poorly planned expansion later. Getting the spatial brief right is only part of it. AI’s ability to hold multiple competing data sets simultaneously also opens up the potential to optimise across materials, MEP strategy, and structural efficiency in ways that compound the benefit.
“AI’s ability to hold multiple competing data sets simultaneously also opens up the potential to optimise”
Not Value Engineering: Optimisation Over Time
The panel discussed whether AI-driven space optimisation is just value engineering with better branding.
In my experience, value engineering happens at the end of a design process, when costs are too high, the brief is fixed, and you’re looking for things to remove or cheapen. AI space programming can have an impact throughout the entire lifecycle of a workspace. Its aim is not to strip back. It’s to calibrate and then keep calibrating.
Calibration can mean smaller, but it can equally mean larger in specific provisions. If teams repeatedly flag in survey feedback the need for more quiet focus space, and that need isn’t properly addressed, the spaces on offer will be underutilised. Organisations will almost certainly then take on additional space on an ad hoc basis when a considered conversion of an underused larger room could have been more appropriate, more sustainable, and better designed. This puts an enhanced emphasis on flexibility, not just at a space’s inception, but across its entire life. The AI may not be able to determine how to make a space physically flexible (that is the designer’s job), but it can absolutely identify a need that we should be designing for.
The data is often already there to tell us what people need, and AI gives us the means to listen to it properly, even in written staff sentiment. Latest figures suggest 48% of desks in London are currently underutilised,² and globally around 64% of office space is underused despite ongoing return to office pressure.³ The task is to listen to that data and develop the workflows that let it inform the brief, before construction, not after, and crucially, to keep listening once the building is occupied.
This is not a one-time exercise. Perhaps the modern workplace could be thought of less like a fixed architectural object and more like an app: something that should be constantly developing and responding to the needs of its users, updated iteratively within a circular economy rather than demolished and rebuilt when it falls short. Modern smart buildings, equipped with occupancy sensors and integrated data systems, already give us the infrastructure to track this in real time. Automated feedback loops can signal when a space is no longer serving its purpose and needs to change, creating an iterative, data-driven relationship between buildings and the people inside them. That conversation, if we design for it from the outset, points towards spaces that are inherently modular and flexible, capable of adapting without reconstruction. A ten-person meeting room that nobody books, in a building where four-person rooms are perpetually oversubscribed, shouldn’t require an extensive refurbishment to fix. It could have been designed to become two four-person rooms the moment the staff feedback data told us it needed to.
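As a sketch of that kind of feedback rule, the ten-person-versus-four-person scenario might be flagged automatically from booking or sensor data. Everything here is an illustrative assumption: the room names, utilisation figures, and thresholds are invented, and a real workflow would draw on live occupancy systems rather than a hard-coded list.

```python
from dataclasses import dataclass

@dataclass
class Room:
    name: str
    capacity: int
    utilisation: float  # fraction of working hours booked, from booking/sensor data

def flag_conversion_candidates(rooms, low=0.25, high=0.85):
    """Pair each underused large room with evidence of oversubscribed small rooms.

    A large room is a conversion candidate when its own utilisation is below
    `low` while at least one room of half its capacity or less sits above `high`.
    """
    candidates = []
    for big in rooms:
        if big.utilisation >= low:
            continue
        oversubscribed = [r for r in rooms
                          if r.capacity <= big.capacity // 2 and r.utilisation >= high]
        if oversubscribed:
            candidates.append((big.name, [r.name for r in oversubscribed]))
    return candidates

rooms = [
    Room("Boardroom-10", 10, 0.12),  # ten-person room, rarely booked
    Room("Huddle-4a", 4, 0.93),      # four-person rooms, perpetually full
    Room("Huddle-4b", 4, 0.91),
]
print(flag_conversion_candidates(rooms))
```

The point of the rule is not the thresholds themselves but that the trigger for change is data, surfaced continuously, rather than a refurbishment cycle.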
Creativity: A Tool, not a Silver Bullet
There was a concern on the panel that AI risks homogenising architectural output: that practices will converge on a single aesthetic as image models make certain visual styles ubiquitous. This is a real risk, but only if we let it happen. It will happen if we use AI to generate aesthetics without judgement or critical input. We should actively guard against this form of laziness as an industry. New tools have always shaped architectural trends, from CAD to BIM to parametric design, and trends have always emerged iteratively until someone pushes against them. AI is not structurally different in that respect. It is a tool and an assistant. It is not magic. Generally, over the history of architecture these tools have allowed an expansion of our capacity and capabilities as designers, but it’s up to us to decide how and where to use them.
What I find more interesting is how AI is already changing the way we communicate about architecture. We interact with most AI tools primarily through text, which shifts how we describe spatial intent. The format and vocabulary required to get a useful output from an image model is potentially quite different from the vocabulary an architect might use with a client or a contractor. Architects are well placed as conceptual mediators in this way, given we spend our careers translating between technical, creative, and client discussions simultaneously.
One moment from the panel stood out on this point, with a panellist describing a client arriving at their office with an AI-generated image of what they wanted their space to look like. It’s an interesting scenario and one we may see more of, but in my experience this kind of approach will only move the discussion on so far. There is still significant iteration between a prompt-generated image and a considered, buildable design, but it signals a shift in how clients form and communicate their expectations. Being part of that collaborative, iterative process at speed, with new tools, is genuinely exciting.
“What AI cannot do is supply the rigour, the technical understanding, or the spatial judgement required to turn an evocative image into a real building.”
AI’s Own Footprint
I don’t think we should dodge the environmental cost of AI itself. Data centres currently represent around 1% of global electricity-related CO₂ emissions,⁴ with projections suggesting this could rise to between 3% and 8% as adoption accelerates. That’s a legitimate concern, and the sector needs to get its energy sourcing onto a sustainable footing: solar, hydroelectric, geothermal, nuclear. The current reliance on fossil fuels is a real problem, not a theoretical one.
Water usage receives less attention but matters too: data centres compete for fresh water at significant scale, and in a world where only around 1% of the Earth’s water is readily drinkable,⁵ that competition is not trivial. Desalination is technically available but energy intensive, another reason why decarbonising the energy supply is the necessary prerequisite to sustainable AI at scale.
That said, AI can still perform a role in decarbonising our industry right now. Buildings and their construction account for around 37% of global emissions.¹ A 5 to 10% improvement in efficiency across the built environment, spanning spatial, material, structural, and operational gains, enabled by AI, would generate carbon savings that substantially outweigh the computing cost of generating them. We can also play our part in the meantime: a basic search engine uses a fraction of the energy of an AI query, so choosing the right tool for the task matters; and starting a new chat for each topic rather than running continuous threads reduces token consumption and the energy that goes with it. The tool is worth running. It just needs to be run thoughtfully, and ultimately on clean energy.
What Comes Next
A few threads from the panel worth watching. Material passports and AI assisted specification offer significant untapped potential: agentic workflows that keep material data current automatically (scanning for lower carbon alternatives, cross referencing EPDs, and flagging substitutions before they are specified) address one of the most time consuming and error prone parts of the process. We’ve also been doing early testing on hyperlocal material sourcing, using AI generated code to build a simple tool that returns suppliers within 50 to 100 miles of a site postcode for a given product type. It’s rough, but the fact that an architect with no coding experience can build and test something like that in a few hours is genuinely exciting. The barriers between having an idea and being able to test it are shrinking rapidly.
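A minimal sketch of that kind of radius tool follows, assuming the site postcode has already been geocoded to latitude/longitude (the geocoding step is omitted) and using an invented supplier list; the names, coordinates, and product types are placeholders, not real suppliers or the practice's actual tool.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# Hypothetical supplier records: (name, product_type, lat, lon).
SUPPLIERS = [
    ("Kent Timber Co", "timber", 51.27, 0.52),
    ("Midlands Steel", "steel", 52.48, -1.90),
    ("Sussex Timber", "timber", 50.83, -0.14),
]

def suppliers_within(site_lat, site_lon, product_type, radius_miles=100):
    """Filter suppliers by product type and straight-line distance from the site."""
    matches = [
        (name, round(haversine_miles(site_lat, site_lon, lat, lon), 1))
        for name, ptype, lat, lon in SUPPLIERS
        if ptype == product_type
        and haversine_miles(site_lat, site_lon, lat, lon) <= radius_miles
    ]
    return sorted(matches, key=lambda m: m[1])  # nearest first

print(suppliers_within(51.51, -0.13, "timber"))  # e.g. for a central-London site
```

Straight-line distance is a simplification; a production version would want road distance and live supplier data, but even this shape is enough to test whether hyperlocal sourcing is viable for a given site.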
GDPR was raised as a concern by the panel, and rightly so. In our testing we used anonymised, aggregated data and developed an AI use policy that considered the risks involved, working within a sandboxed environment. Any practice deploying AI on client briefs needs a clear policy and a clear understanding of where data is going. This should be the starting point, not an afterthought.
The final point I keep returning to is that the most sustainable building is the one we don’t have to build. The second most sustainable is the one optimised to its requirements at any given moment, able to respond to what people need, informed by a continuous flow of data about how they work, and designed with enough quality, flexibility, and intelligence to stay relevant for decades rather than a single lease cycle. AI is an increasingly capable tool for achieving that, not as a replacement for the designer’s judgement, but as the analytical foundation that makes that judgement faster, better informed, and more confident. We as designers are then liberated to do what we are here to do: use our imagination to create beautiful, human-centred workplaces that work hard for the people inside them.
“We’re at an early stage, and the workflows are still being figured out. But the direction is clear, and the opportunity, for the profession and for the performance of what we build, is significant.”
References
1. UNEP / Global Alliance for Buildings and Construction, 2024 Global Status Report for Buildings and Construction. Buildings and construction sector: approximately 37% of global energy-related CO₂ emissions.
2. Advanced Workplace Associates (AWA), Utilisation Study. 48% average desk utilisation across 36,100 desks, 75 buildings, over 5 years. Also cited in BCO and CBRE/JLL 2024 surveys.
3. JLL / CBRE, 2024 Office Occupancy Research. 64% of global office space underutilised despite return-to-office pressures.
4. International Energy Agency (IEA), Electricity 2025. Data centres: approximately 1 to 1.5% of global electricity-related CO₂ emissions as of 2024; projected growth scenarios to 2030.
5. United States Geological Survey (USGS) / UN Water. Approximately 71% of Earth’s surface is water; less than 1% is readily accessible fresh water for human use.