Building a webapp to visualize change in historic average temperatures
In this article I will show how I built a webapp to visualize the change in average temperatures in Austria. I used data from the Geosphere Austria API and built the app with FastAPI, DuckDB, and Google Cloud Run. I will walk through the steps I took to build the app, the challenges I faced, and how I solved them.
Recently I stumbled upon the Geosphere Austria API, which provides a lot of data from all of their weather stations in Austria. I thought I could turn this into an interesting project while also learning more about FastAPI and web development in general.
So let us start with the architecture I chose and the steps I took to build the app. You can find all of the code in my GitHub repository, and the app itself is hosted at climateaustria.vinzenzhalhammer.com.

Note that this article is not a step-by-step tutorial, but rather an overview of the architecture and the steps I took to build the app. I also only cover the technical aspects here, not the data itself. That is something I want to do in a future article.
Data pipeline
I tried to keep the data pipeline as simple as possible. I chose DuckDB as the database because it is fast and lightweight, can run in memory, and does not require a separate server. I then looked at the data provided by the Geosphere Austria API and picked the klima-v2-1y dataset, which contains historic yearly average temperatures for each weather station in Austria.
Note that the homepage is only in German, but the API documentation is in English if you want to work with it yourself. I also want to give credit to the Geosphere Austria team for providing a really good API with a lot of data and good documentation. This is not always the case with public APIs, so I really appreciate their work.
For simplicity I decided to use a single script, data_load_pipeline.py, which handles database setup, table creation, data fetching, and creating the final view used by the frontend.
As we deal with yearly data (and I wanted to keep the cost minimal), I initially did not worry about setting up a scheduled job to update the data, but I might still add this later on.
In the pipeline I fetch the weather station metadata and the yearly measurements from the API and store them in two separate tables. Before insertion, some basic data cleaning is done, such as removing duplicates and converting columns to the correct data types. As the weather stations' measurements start in different years, I also filter the data to only include measurements from 1950 onwards. This gives a consistent dataset for the visualization.
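To illustrate, the cleaning steps could look roughly like this. This is a sketch with made-up example rows; the actual field names in the klima-v2-1y response may differ:

```python
import pandas as pd

# Hypothetical raw yearly measurements as fetched from the API
raw = pd.DataFrame({
    "id": ["105", "105", "105", "30"],
    "year": ["1948", "1951", "1951", "1960"],  # note the duplicate row
    "tl_mittel": ["6.8", "7.2", "7.2", "9.1"],
})

# Remove exact duplicates
clean = raw.drop_duplicates()

# Convert columns to the correct data types
clean = clean.astype({"id": int, "year": int, "tl_mittel": float})

# Keep only measurements from 1950 onwards for a consistent dataset
clean = clean[clean["year"] >= 1950].reset_index(drop=True)

print(clean)
```

The same steps run once per table before the cleaned frames are inserted into the database.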
Finally, I create two more tables: one with a 10-year rolling average of the yearly measurements, which is used for the visualization, and one with the average temperatures and the temperature difference for the two periods 1950-1970 and 2000-2024. Then I create the final view, which contains all the information the frontend needs to display the data.
CREATE OR REPLACE VIEW station_frontend_data AS
SELECT
    s.id AS id,
    s.name,
    s.latitude,
    s.longitude,
    ss.pre1970_temp,
    ss.post2000_temp,
    ss.delta_temp,
    sys.year,
    sys.tl_mittel,
    sys.rolling_avg_temp_10y
FROM stations s
JOIN station_summary ss ON s.id = ss.id
JOIN station_yearly_stats sys ON s.id = sys.id;
Now we have all the data ready to be used by the frontend. The data is stored in a DuckDB database file, which is loaded into memory when the app starts.
Webapp backend with FastAPI
For the backend I chose FastAPI, because it is a modern and fast web framework and I had always wanted to try it out. In the backend, the data is loaded from the DuckDB database into a pandas DataFrame in memory and then served via an endpoint.
The whole setup is quite simple: I defined just two routes. The first is the root route, which serves the HTML file for the frontend, and the second is the data endpoint, which serves the right data for the frontend. The data is returned as a JSON object, which the frontend then uses to display the data. Every time a user picks another weather station, the frontend sends a request to the backend to get the data for that station.
Here is the data route as an example.
@app.get("/data")
async def get_city_data(town: str = "Aigen im Ennstal") -> JSONResponse:
    """Returns climate data for a given town as JSON.

    Args:
        town (str, optional): The selected town. Defaults to "Aigen im Ennstal".

    Returns:
        JSONResponse: JSON containing climate data for the selected town.
    """
    selection = TOWN_ID_MAPPING.get(town, 105)
    df = get_station_summary()
    df = df[df["id"] == int(selection)]
    pre_industrial = df["pre1970_temp"].iloc[0]
    modern_avg = df["post2000_temp"].iloc[0]
    delta = df["delta_temp"].iloc[0]
    labels = df["year"].tolist()
    smoothed_data = df["rolling_avg_temp_10y"].tolist()
    return JSONResponse({
        "town": town,
        "pre_industrial": pre_industrial,
        "modern_avg": modern_avg,
        "delta": delta,
        "labels": labels,
        "smoothed_data": smoothed_data,
    })
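The TOWN_ID_MAPPING dictionary used in the route maps town names to station ids. Its construction is not shown here, but one simple way to derive it (sketched with made-up example rows) is from the stations table:

```python
# Hypothetical (id, name) rows as they might come from the stations table
station_rows = [
    (105, "Aigen im Ennstal"),
    (30, "Wien Hohe Warte"),
]

# Map town names to station ids for quick lookups in the /data route
TOWN_ID_MAPPING = {name: station_id for station_id, name in station_rows}

# Unknown towns fall back to the default station id 105
default_id = TOWN_ID_MAPPING.get("Unknown town", 105)
print(default_id)  # → 105
```

Falling back to a default id keeps the endpoint from failing on an unexpected town name.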
Webapp frontend with Tailwind CSS and JavaScript
For the frontend I used Tailwind CSS for styling, which I had already used for my personal website and really like. It makes it easy to create responsive, modern-looking UIs without writing a lot of CSS, which is especially helpful if you don't have much experience in web development. I used Leaflet for the interactive map and ApexCharts for the chart. I already knew Leaflet from Python and really like it, so I decided to use it for the frontend as well. I stumbled across ApexCharts while looking for inspiration on Flowbite and really liked its look, so I used it for the chart. I also used the modal component from Flowbite to display more information about the data and the project itself.
After setting up the basic structure of the frontend, I started implementing the map and the chart. The map is used to display and select weather stations, and the chart displays the temperature data for the selected station. I used JavaScript to handle the interactions between the map, the chart, and the dropdown menu. One could probably also use a framework like React or Vue.js for this, but I had never done so before, so I stuck with plain JavaScript. This is, however, something I would like to try out in the future.
That is it for now regarding the frontend; for more details on the implementation you can check out the index.html file in the GitHub repository. I will not go into detail here, as the code is quite self-explanatory and I want to keep the article focused on the overall architecture and the steps I took to build the app.
Deployment & hosting of the webapp
I decided to host the webapp on Google Cloud Run, because it is a serverless platform that lets you run your app in a container without having to worry about the underlying infrastructure. I also wanted to keep the costs low, and Cloud Run has a generous free tier that includes 240,000 vCPU-seconds and 450,000 GiB-seconds of free usage per month. Additionally, I let the app scale down to zero when it is not in use; this increases the startup time a bit, but that is not a concern for a side project like this.
I used the built-in features of Cloud Build to build the container image directly from the GitHub repository and deploy it to Cloud Run, which is a very convenient way to deploy an app. To keep it quick and simple I set everything up via the Google Cloud UI, but you can also do it via the command line or with Terraform. As I use uv to manage my project dependencies, I also use it in the container image, which is built with the Dockerfile below.
FROM ghcr.io/astral-sh/uv:python3.11-bookworm-slim
EXPOSE 8080
WORKDIR /app
COPY . /app
RUN uv sync --locked
ENTRYPOINT ["uv", "run", "uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8080"]
So now we have the container image up and running on Cloud Run, and we can access the webapp via the URL provided by Cloud Run. But I also wanted the webapp to be accessible via my custom domain. I already had a domain registered with Cloudflare for my personal website, so I just added a new subdomain, climateaustria.vinzenzhalhammer.com.
The next step was pointing the subdomain to the Cloud Run service. This was trickier than I expected, but after a bit of research I figured it out. You have to add a custom domain mapping in the Cloud Run service settings, which then provides you with a CNAME record that you have to add to your DNS settings in Cloudflare.
So far, so good, but I also wanted an SSL certificate for the subdomain to make it secure. This is where I was stuck for a moment. Google Cloud Run uses an HTTP-01 challenge to verify domain ownership and issue an SSL certificate (more on that here).
If you keep Cloudflare in its default settings, it will not allow HTTP requests to your service, because it always redirects them to HTTPS. This is a problem, because the HTTP-01 challenge requires a plain HTTP request to verify domain ownership.
To solve this, you have to turn off automatic HTTPS rewrites in the Cloudflare settings for your subdomain. Additionally, you have to set the SSL/TLS encryption mode to "Full" (not "Full (strict)").
I also created a rule for the challenge path /.well-known/acme-challenge/ to explicitly allow HTTP requests, so certificate renewal should also work in the future.
Google mentions something along those lines in their documentation, but does not provide more detail.
After this "deeper than initially planned" dive into certificate issuing and domain mapping, I finally had the webapp up and running on my custom domain with a valid SSL certificate. Perfect!
What I learned
The biggest challenge was definitely the domain mapping and the SSL certificate, but I learned a lot about web development and DNS settings in the process. I also learned a lot about FastAPI and how to build a webapp with it. The documentation is really good and it is easy to get started. I really enjoyed this mini project and I am happy with the result. Maybe I will add more features in the future or use the Geosphere Austria API for another project.
I think it is useful as a data scientist to have some experience with web development and building webapps, as it allows you to share your work with others and make it more accessible. Often a Streamlit app is enough, but sometimes you reach the limits of what it can do, and then it is good to have the flexibility of a custom webapp.
You can try the app here: climateaustria.vinzenzhalhammer.com, and you will find the code in my GitHub repo: climate_change_austria. You can also get the app running yourself by following the instructions in the README file. I'd love feedback, whether technical or about the design/UX. Feel free to contact me here or send me an email at info@vinzenzhalhammer.com, and do reach out if you have any questions about the implementation or the technologies used. Have fun exploring the app, and I hope to see you in my next article!