Daily Wrap Up – March 14, 2023 (#045)

Big day for AI. OpenAI shipped GPT-4 and Google announced a host of AI-infused Cloud products. Sit back and learn, even if you’re not ready to dive into that domain. Check out those stories, plus lots of other things I consumed today, below.

[blog] The next generation of AI for developers and Google Workspace. One way or another, office work won’t be the same. Generative AI is going to have an impact. As you might expect, I’m fairly intrigued and excited about what we’re doing, and there’s a lot more to come.

[youtube-video] A new era for AI and Google Workspace. What might “new” office work look like? This video gives you a glimpse into what assistive AI can do.

[article] How to Help Superstar Employees Fulfill Their Potential. I needed this. Sometimes I feel stuck about which direction to go when mentoring folks at work. This gives me a few structured dimensions to focus on.

[article] The problem with development speed. Good stuff here on not accidentally focusing on pure velocity. More code doesn’t equal more impact. Learn faster. That’s the key.

[blog] Failure Mitigation for Microservices: An Intro to Aperture. This is an excellent post on the types of failures that can happen in a distributed, microservices architecture and how you might mitigate them.

[blog] Unlocking Real-time Predictions with Shopify’s Machine Learning Platform. “The cloud” is really a platform for your platform. Here, the Shopify team goes deeper into the ML experience they built for their users.

[youtube-video] Pub/Sub Best Practices: Patterns, Experimentation, and Testing. Even if you’re not using Pub/Sub (I’ll still be friends with you) there’s advice here that impacts many messaging scenarios.

[blog] What is fault tolerance, and how to build fault-tolerant systems. The team at CockroachDB looks at what fault tolerance means, how it differs from high availability, and how to architect for fault tolerance.

[blog] Building the most open and innovative AI ecosystem. You can likely count on one hand the number of companies with the necessary compute capacity to train and serve massive large language models. It’s all about the ecosystem, as this post explores.


Want to get this update sent to you every day? Subscribe to my RSS feed or subscribe via email below:

Author: Richard Seroter

Richard Seroter is currently the Chief Evangelist at Google Cloud and leads the Developer Relations program. He’s also an instructor at Pluralsight, a frequent public speaker, the author of multiple books on software design and development, and a former InfoQ.com editor plus former 12-time Microsoft MVP for cloud. As Chief Evangelist at Google Cloud, Richard leads the team of developer advocates, developer engineers, outbound product managers, and technical writers who ensure that people find, use, and enjoy Google Cloud. Richard maintains a regularly updated blog on topics of architecture and solution design and can be found on Twitter as @rseroter.
