Generative AI

Challenge 01 — Climate Intelligence

Problem Statement

Climate change affects all of us. But understanding what we can actually do about it is hard.


Switch energy provider or stick with the current one? Does your business comply with the latest regulations — or not? Which government grants are available, and do you qualify? The answers exist somewhere, buried in government reports, legal databases, and scientific datasets that most people will never read.


The challenge is to use AI to close that gap. Build a tool that takes the complexity out of climate action — something that can read the data, understand the question, and give a real person a clear, trustworthy answer they can act on.

Data Available

The following open datasets are suggested as starting points. You are not limited to them; use whatever sources best support your solution.

Data Integration

Your solution must demonstrate that the AI is grounded in real data, not generating responses from training knowledge alone. Data integration is intentionally defined broadly; any of the following approaches qualifies:

  • Calling a dataset's public API or downloading a structured file (CSV, JSON) and referencing it in prompts
  • Retrieval-Augmented Generation (RAG) using a vector store
  • Structured prompting with pre-loaded excerpts from a dataset
  • Web search grounding against one of the listed sources

A full RAG pipeline is not required. A team that retrieves a relevant CSV row and passes it to an LLM with a well-designed prompt meets this requirement. What matters is that responses are traceable to a real source.
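The minimal approach above can be sketched in a few lines. The sample data, column names, and helper functions below are illustrative assumptions, not taken from any real dataset; in practice you would load a CSV downloaded from one of the listed sources and send the resulting prompt to your LLM of choice.

```python
import csv
import io

# Hypothetical sample of a cantonal energy-grant dataset.
# Field names and values are made up for illustration only.
SAMPLE_CSV = """canton,grant_name,max_amount_chf,eligibility
ZH,Solar Rooftop Subsidy,10000,homeowners
ZH,Heat Pump Replacement,8000,homeowners and SMEs
BE,Building Insulation Grant,12000,building owners
"""

def retrieve_rows(canton: str) -> list[dict]:
    """Return the dataset rows matching the user's canton."""
    reader = csv.DictReader(io.StringIO(SAMPLE_CSV))
    return [row for row in reader if row["canton"] == canton]

def build_grounded_prompt(question: str, canton: str) -> str:
    """Embed the retrieved rows in the prompt so the LLM answers
    from the data rather than from training knowledge alone."""
    rows = retrieve_rows(canton)
    if not rows:
        # No grounding data: instruct the model to decline, not guess.
        return (f"Question: {question}\n"
                "No matching data was found. Reply that you cannot "
                "answer this from the available dataset.")
    context = "\n".join(
        f"- {r['grant_name']}: up to CHF {r['max_amount_chf']} "
        f"({r['eligibility']})"
        for r in rows
    )
    return (f"Answer using ONLY the data below, and cite it.\n"
            f"Data (canton {canton}):\n{context}\n\n"
            f"Question: {question}")

print(build_grounded_prompt("Which grants can my SME apply for?", "ZH"))
```

The key design choice is that retrieval happens before the LLM call, so every answer can be traced back to specific rows in the source file.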

Expected Output

  • Functional Prototype — A web or mobile application (e.g. a Streamlit dashboard or chat interface) that utilises an LLM.
  • Data Integration — Demonstration that the AI grounds responses in at least one dataset using any of the approaches above.
  • User Persona Scenario — A clear demonstration of how a specific user (e.g. a business owner or local student) gains value from the tool.
  • Technical Documentation — A brief overview of the architecture, including which dataset(s) are used and how.

Demonstrating Reliability

Rather than documenting hallucination mitigation in the abstract, your presentation must include a live demonstration of at least one of the following:

  • The tool declines to answer a question that falls outside the scope of its grounding data, rather than fabricating a response
  • The tool presents an answer with an explicit citation or uncertainty qualifier (e.g. "According to Fedlex SR 814.01, your business is required to..." or "Based on opendata.swiss energy data for your canton, the available grants are...")
  • The tool flags when a user's query cannot be answered with confidence and suggests an alternative source

This will be evaluated by judges during the 5-minute presentation.
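The decline-rather-than-fabricate behaviour can be demonstrated with a simple guard around the LLM call. The topic list, wording, and fallback suggestion below are illustrative assumptions; a real implementation might instead ask the model itself to assess coverage, but the principle is the same.

```python
# Minimal sketch of "decline when outside the grounding data".
# The scope keywords are an illustrative assumption, not a real product list.
GROUNDED_TOPICS = {"energy", "grant", "subsidy", "emissions", "regulation"}

def answer_or_decline(question: str, retrieved_context: str) -> str:
    """Return a grounded answer stub, or decline when nothing was
    retrieved or the question falls outside the dataset's topics."""
    in_scope = any(t in question.lower() for t in GROUNDED_TOPICS)
    if not in_scope or not retrieved_context:
        # Flag low confidence and point the user to an alternative source.
        return ("I can't answer that reliably from my source data. "
                "Try the official opendata.swiss portal for this topic.")
    return (f"Based on the source data: {retrieved_context} "
            "(see citation above).")

print(answer_or_decline("What's the best pizza in Zurich?", ""))
print(answer_or_decline("Which energy grants apply to my canton?",
                        "Heat Pump Replacement, up to CHF 8000"))
```

During the live demo, asking an off-topic question shows the first branch; an in-scope question with retrieved data shows the cited answer.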

Success Criteria

  • Trust & Accuracy — The tool provides factually grounded information, with responses traceable to the source data.
  • Localisation — The solution handles regional context where relevant (e.g. a specific EU directive, US federal regulation, or Swiss cantonal policy).
  • Accessibility — The UI/UX simplifies complex concepts for non-technical users.
  • Impact Potential — The degree to which the tool could realistically lead to measurable climate action or improved policy compliance.
Register via Climate Week Zurich