Daily Wrap Up – March 14, 2023 (#045)

Big day for AI. OpenAI shipped GPT-4 and Google announced a host of AI-infused Cloud products. Sit back and learn, even if you’re not ready to dive into that domain yet. Check out the many other things I consumed today below.

[blog] The next generation of AI for developers and Google Workspace. One way or another, office work won’t be the same. Generative AI is going to have an impact. As you might expect, I’m fairly intrigued and excited about what we’re doing, and there’s a lot more to come.

[youtube-video] A new era for AI and Google Workspace. What might “new” office work look like? This video gives you a glimpse into what assistive AI can do.

[article] How to Help Superstar Employees Fulfill Their Potential. I needed this. Sometimes I feel stuck about which direction to go when mentoring folks at work. This gives me a few structured dimensions to focus on.

[article] The problem with development speed. Good stuff here on not accidentally focusing on pure velocity. More code doesn’t equal more impact. Learn faster. That’s the key.

[blog] Failure Mitigation for Microservices: An Intro to Aperture. This is an excellent post on the types of failures that can happen in a distributed, microservices architecture and how you might mitigate them.
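
The post is about Aperture specifically, and I haven’t reproduced its API here. But as a generic flavor of the kind of mitigation pattern that comes up in this space, here’s a toy retry-with-backoff plus circuit-breaker sketch in Python. The class, names, and thresholds are all mine, purely for illustration:

```python
import random
import time


class CircuitBreaker:
    """Toy circuit breaker: trip after N consecutive failures, allow a probe after a cooldown."""

    def __init__(self, failure_threshold=5, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None

    def allow(self):
        # Closed circuit: always allow. Open circuit: allow a probe once the cooldown passes.
        if self.opened_at is None:
            return True
        return (time.monotonic() - self.opened_at) >= self.reset_timeout

    def record_success(self):
        self.failures = 0
        self.opened_at = None

    def record_failure(self):
        self.failures += 1
        if self.failures >= self.failure_threshold:
            self.opened_at = time.monotonic()


def call_with_retries(func, breaker, max_attempts=3, base_delay=0.2):
    """Retry a flaky downstream call with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        if not breaker.allow():
            raise RuntimeError("circuit open; failing fast instead of piling on")
        try:
            result = func()
            breaker.record_success()
            return result
        except Exception:
            breaker.record_failure()
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff with jitter so a fleet of clients doesn't retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

The jitter and the fail-fast behavior are there to avoid retry storms, which is exactly the kind of failure amplification that purpose-built flow-control tools aim to prevent.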

[blog] Unlocking Real-time Predictions with Shopify’s Machine Learning Platform. “The cloud” is really a platform for your platform. Here, the Shopify team goes deeper into the ML experience they built for their users.

[youtube-video] Pub/Sub Best Practices: Patterns, Experimentation, and Testing. Even if you’re not using Pub/Sub (I’ll still be friends with you), there’s advice here that applies to many messaging scenarios.
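
The video speaks for itself, but if you want something to poke at alongside it, here’s a minimal publish-and-subscribe sketch using the google-cloud-pubsub Python client. The project, topic, and subscription IDs are placeholders, and none of this is taken from the talk:

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

PROJECT_ID = "my-project"      # placeholder
TOPIC_ID = "demo-topic"        # placeholder
SUBSCRIPTION_ID = "demo-sub"   # placeholder

# Publish a message and block until the server assigns a message ID.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
future = publisher.publish(topic_path, b"hello", origin="daily-wrapup")
print(f"Published message {future.result()}")

# Pull messages with the streaming subscriber and ack them in a callback.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub doesn't redeliver


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull_future.result(timeout=30)  # listen for 30 seconds
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()  # block until shutdown completes
```

The callback-plus-ack flow is where a lot of the testing advice tends to matter, since any message that isn’t acknowledged before its deadline gets redelivered.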

[blog] What is fault tolerance, and how to build fault-tolerant systems. The team at CockroachDB looked at what fault tolerance means, how it differs from high availability, and how to architect for fault tolerance.

[blog] Building the most open and innovative AI ecosystem. You can likely count on one hand the number of companies with the necessary compute capacity to train and serve massive large language models. It’s all about the ecosystem, as this post explores.

##

Want to get this update sent to you every day? Subscribe to my RSS feed or subscribe via email below:

Author: Richard Seroter

Richard Seroter is Director of Developer Relations and Outbound Product Management at Google Cloud. He’s also an instructor at Pluralsight, a frequent public speaker, the author of multiple books on software design and development, a former InfoQ.com editor, and a former 12-time Microsoft MVP for cloud. In that role at Google Cloud, Richard leads an organization of developer advocates, engineers, platform builders, and outbound product managers who help customers find success in their cloud journey. Richard maintains a regularly updated blog on topics of architecture and solution design and can be found on Twitter as @rseroter.
