We’ve got a three-day weekend coming up here in the States, and it’s at the right time. I’m spent and looking forward to some sunshine and downtime. See you back at the Reading List on Tuesday!
[blog] Now’s a great time to rediscover PaaS. Platforms are in vogue, as they should be. I’m not sure Cloud Foundry is the right bet in 2025, but Coté makes a good case.
[blog] How We Decomposed Tinder’s Monolith. Faster build times, and a simpler system that can be updated more easily? Good work from the Tinder engineering team.
I’m catching up after a few days in Mountain View and starting to clear out the reading queue. Still a ways to go. Enjoy today’s list!
[blog] Vibing at Home. I like seeing people share their own experiences. Those who only pontificate online about AI coding are losing credibility with me. At least try it yourself before lamenting or praising it!
[blog] What’s new in Firebase at I/O 2025. The Firebase team is cooking right now. Check out this post to see all the ways that modern app dev is boosted with Firebase.
[blog] Your First Spring AI 1.0 Application. Many details and code samples in this post from Josh. He covers many important dimensions like observability and security too.
[blog] What’s new in Flutter 3.32. These seem like valuable updates. For those building cross-platform mobile or web apps, Flutter remains a great choice.
[blog] Takeaways from Coding with AI. Wow, this was a great collection of people opining about the plusses and minuses of software engineering and coding with AI. Read Tim O’Reilly’s recap.
Where you decide to run your web app is often a late-binding choice. Once you’ve finished coding something you like and done some localhost testing, you seek out a reasonable place that gives you a public IP address. Developers have no shortage of runtime host options, including hyperscalers, rented VMs from cheap regional providers, or targeted services from the likes of Firebase, Cloudflare, Vercel, Netlify, Fly.io, and a dozen others. I’m an unapologetic fanboy of Google Cloud Run (it hosts scale-to-zero apps, functions, and jobs with huge resource configurations, concurrent requests, GPUs, and durable volumes, plus a generous free tier and straightforward pricing), and we just took the wraps off a handful of new ways to take a pile of code and turn it into a cloud endpoint.
Vibe-code a web app in Google AI Studio and one-click deploy to Cloud Run
Google AI Studio is really remarkable. Build text prompts against our leading models, generate media with Gemini models, and even build apps, all at no cost. We just turned on the ability to do simple text-to-app scenarios, and added a button that deploys your app to Cloud Run.
First, I went to the “Build” pane and added a text prompt for my new app. I wanted a motivational quote printed on top of an image of an AI-generated dog.
In one shot, I got the complete app including the correct backend AI calls to Gemini models for creating the motivational quote and generating a dog pic. So cool.
Time to ship it. There’s a rocket ship icon in the top right. Assuming you’ve connected Google AI Studio to a Google Cloud account, you’re able to pick a project and one-click deploy.
It takes just a few seconds, and you get back the URL and a deep link to the app in Google Cloud.
Clicking that link shows that this is a standard Cloud Run instance, with the Gemini key helpfully added as an environment variable (versus hard coded!).
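Because the key lives in an environment variable rather than in code, you can rotate it later without touching the app. A sketch of what that looks like with the Google Cloud CLI; the service name and region here are placeholders, and I’m assuming the variable is named GEMINI_API_KEY as shown in the console:

```shell
# Placeholders for service name and region. --update-env-vars changes just
# this one variable and rolls out a new revision of the service.
gcloud run services update my-ai-studio-app \
  --region=us-central1 \
  --update-env-vars=GEMINI_API_KEY=your-new-key
```

The same command works for any configuration value you want to keep out of the container image.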
And of course, viewing the associated link takes me to my app that gives me simple motivation and happy dogs.
That’s such a simple development loop!
Create a .NET app in tools like Cursor and deploy it using the Cloud Run MCP server
Let’s say you’re using one of the MANY agentic development tools that make it simpler to code with AI assistance. Lots of you like Cursor. It supports MCP as a way to reach into other systems via tools.
We just shipped a Cloud Run MCP server, so you can make tools like Cursor aware of Cloud Run and support straightforward deployments.
I started in Cursor and asked it to build a simple REST API and picked Gemini 2.5 Pro as my preferred model. Cursor does most (all?) of the coding work for you if you want it to.
It went through a few iterations to land on the right code. I tested it locally to ensure the app would run.
Cursor has native support for MCP. I added a .cursor directory to my project and dropped an mcp.json file in there. Cursor picked up the MCP entry, validated it, and showed me the available tools.
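For reference, a Cursor mcp.json for this server looks something like the following. The exact invocation comes from the Cloud Run MCP server’s README, so treat this as a sketch and check that repo for the current command:

```json
{
  "mcpServers": {
    "cloud-run": {
      "command": "npx",
      "args": ["-y", "https://github.com/GoogleCloudPlatform/cloud-run-mcp"]
    }
  }
}
```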
I asked Cursor to deploy my C# app. It explored the local folder and files to ensure it had what it needed.
Cursor realized it had a tool that could help, and proposed the “deploy_local_folder” tool from the Cloud Run MCP server.
After providing some requested values (location, etc), Cursor successfully deployed my .NET app.
That was easy. And this Cloud Run MCP server will work with any of your tools that understand MCP.
Push an open model from Google AI Studio directly to Cloud Run
Want to deploy a model to Cloud Run? It’s the only serverless platform I know of that offers GPUs. You can use tools like Ollama to deploy any open model to Cloud Run, and I like that we made it even easier for Gemma fans. To see this integration, pick one of the Gemma 3 editions in Google AI Studio.
Once you’ve done that, you’ll see a new icon that triggers a deployment directly to Cloud Run. Within minutes, you have an elastic endpoint providing inference.
It’s not hard to deploy open models to Cloud Run. This option makes it that much easier.
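If you’d rather script the Ollama route than click through AI Studio, a rough sketch looks like this. The service name, region, and resource sizes are illustrative, and the GPU flags assume Cloud Run’s NVIDIA L4 support; check the Cloud Run GPU docs for current flags and quotas:

```shell
# Illustrative values. GPU-attached services need CPU always allocated,
# hence --no-cpu-throttling; Ollama listens on port 11434.
gcloud run deploy ollama-gemma \
  --image=ollama/ollama \
  --port=11434 \
  --gpu=1 --gpu-type=nvidia-l4 \
  --cpu=4 --memory=16Gi \
  --no-cpu-throttling \
  --region=us-central1
```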
Deploy a Python agent built with the Agent Development Kit to Cloud Run with one command
The Agent Development Kit is an open source framework and toolset that devs use to build robust AI agents. The Python version reached 1.0 yesterday, and we launched a new Java version too. Here, I started with a Python agent I built.
Built into ADK are a few deployment options. It’s just code, so you can run it anywhere. But we’ve added shortcuts to services like Google Cloud’s Vertex AI Agent Engine and Cloud Run. Just one command puts my agent onto Cloud Run!
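That one command is ADK’s built-in Cloud Run deploy. The project, region, and agent path below are placeholders, so check the ADK deployment docs for the flags your version supports:

```shell
# Placeholders for project ID, region, and the agent's folder.
adk deploy cloud_run \
  --project=my-gcp-project \
  --region=us-central1 \
  ./my_agent
```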
We don’t yet have this CLI deployment option for the Java ADK. But it’s still simple to deploy a Java app or agent to Cloud Run with a single Google Cloud CLI command.
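The generic source-based deploy works for Java, or anything else with a supported buildpack. The service name and region below are placeholders:

```shell
# Run from the project root; Cloud Run builds the container from source.
gcloud run deploy my-java-agent \
  --source=. \
  --region=us-central1 \
  --allow-unauthenticated
```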
Services like Cloud Run are ideal for your agents and AI apps. These built-in integrations for ADK help you get these agents online quickly.
Use a Gradio instance in Cloud Run to experiment with prompts after one click from Vertex AI Studio
How do you collaborate or share prompts with teammates? Maybe you’re using something like Google Cloud Vertex AI to iterate on a prompt yourself. Here, I wrote system instructions and a prompt for helping me prioritize my work items.
Now, I can click “deploy an app” and get a Gradio instance for experimenting further with my app.
This has public access by default, so I’ve got to give the ok.
After a few moments, I have a running Cloud Run app! I’m shown this directly from Vertex AI and have a link to open the app.
That link brings me to this Gradio instance that I can share with teammates.
Scalable and easy to access, Cloud Run is ideal for spontaneous exploration of things like AI prompts. I like this integration!
Ship your backend Java code to Cloud Run directly from Firebase Studio
Our final example looks at Firebase Studio. Have you tried this yet? It’s a free-to-use, full-stack dev environment in the cloud for nearly any type of app. And it supports text-to-app scenarios if you don’t want to do much coding yourself. There are dozens of templates, including one for Java.
I spun up a Java dev environment to build a web service.
This IDE will look familiar. Bring in your favorite extensions, and we’ve also pre-loaded this with Gemini assistance, local testing tools, and more. See here that I used Gemini to add a new REST endpoint to my Java API.
Here on the left is an option to deploy to Cloud Run!
After authenticating to my cloud account and picking my cloud project, I could deploy. After a few moments, I had another running app in Cloud Run, and had a route to make continuous updates.
Wow. That’s a lot of ways to go from code to cloud. Cloud Run is terrific for frontend or backend components, functions or apps, open source or commercial products. Try one of these integrations and tell me what you think!
It’s been a great day at Google I/O, and I think we made a big industry impact. I still have to catch up on what we wrote, and what others wrote about us, so expect a few of those in tomorrow’s reading list.
[blog] MCP Authorization in practice with Spring AI and OAuth2. Speaking of Spring, here’s a guide for setting up access security for MCP servers. We need to see more of this with all the widespread MCP adoption happening.
I’m in Silicon Valley today to get ready for Google I/O which starts tomorrow. It’s a welcome distraction as I had a complex weekend. Some great family time, but then one of my two dogs (Ellie) passed away. I loved that pup and had almost thirteen great years of companionship. Dogs are the best.
This was a blur of a week, and I try to avoid those. I usually do a better job of truly breaking it up to slow down the key parts. Not this week. But, it’s a good-looking weekend ahead, and unintuitively, Google I/O will likely be more low-key than “regular” work.
[blog] Building software on top of Large Language Models. Simon built some hands-on labs for a workshop on building apps that use LLMs. He’s using the wrong model, but nobody’s perfect. Still, great assets.
[docs] Generative AI glossary. A lot of terms get thrown around nowadays. We just published this new glossary of generative AI terms and I like the depth we provided for each one.
I’m getting ready for Google I/O next week, and excited to deliver the “what’s new in cloud” talk. It’s probably the most preparation I ever do for a talk, as it’s tricky speaking authoritatively about dozens of distinct things across an entire cloud portfolio. Livestream is available if you want to watch me fumble through it!
[blog] What does a Technical Lead do? Good post. Do you call these people architects? Something else? Either way, I like this effort to classify the work.
[blog] Google ADK + Vertex AI Live API. I’m happy when curious folks solve problems before solutions are well-documented. Here, Sascha integrates our Agent Development Kit with the amazing Gemini Live API, even though we haven’t really shown folks how to do it yet!
[blog] Platform Engineering: Evolution or Rebranding? Bit of both, depending on where you look. For some, it’s just a change in job title. But that misses the continued progress towards treating ops like a software engineering challenge.
Today’s list has an informative mix of advice and news. Learn how to build visual artifacts in your docs, how to design APIs, the right way to think about junior developers, and how to handle stress around layoffs.
[blog] CRUD APIs are Poor Design. Throwing create-read-update-delete operations on your core entities isn’t an API strategy. Derek proposes a better approach.
[blog] The Android Show: I/O Edition. Here’s a set of blogs from yesterday’s Android event that highlighted some visual updates, expansion to more devices, and some impressive stats.
[article] When Your Layoff Anxiety Won’t Go Away. I’ve seen people become paralyzed by this fear. It’s not an unreasonable one in this climate. But the incessant worrying will have negative effects.
I don’t know the “future of coding” with AI, but there are sure a lot of opinions out there. I was in a few forums today where it came up, and I’m seeing so much discussion on the interwebs. Even if we could have AI do all our coding, would that be what we want? I doubt it.
[blog] Graceful Shutdown in Go: Practical Patterns. Are you good at graceful shutdowns for your apps and systems? Here’s some good advice, even if you aren’t writing these patterns in Go.