Google Cloud Next OnAir’20 AMER recently delivered a virtual, nine-week program of learning opportunities and product insights from GCP. John Loughlin comments on some of his key takeaways from the event.

Google Cloud Next OnAir’20 has ended. In a program spanning nine weeks, the virtual event covered a wide range of topics (Infrastructure, Security, Data Analytics, Cloud AI, Application Modernization, etc.) and featured over 200 sessions providing learning opportunities with Google Cloud developers and insights from thought leaders.

Having looked forward to experiencing Next in its original – in-person – format, I was curious how a digital-only, remote version of GCP’s enterprise conference would land. I’m happy to say that I found real advantages in a paced-out, virtual event. It allowed me the time to be deliberate about the sessions I wanted to attend and to truly focus on absorbing the enormous amount of material that the GCP team had to share. In fact, I grew to look forward to the weekly sugar high from the latest update to a newly stocked cloud candy store!

Here are a few highlights from the program.

Google Cloud’s three capabilities

In the opening keynote, Thomas Kurian said that the mission of Google Cloud is “to accelerate every organization’s ability to digitally transform and reimagine their business through data-powered innovation” and mentioned three capabilities to enable this.

  1. The first of these is distributed infrastructure as a service and GCP’s plan to provide the ability to run in the public cloud, hybrid cloud, multi-cloud, and edge using common APIs. For all of the sophisticated technology, the most exciting thing for me here was a customer profile with Spotify, where an engineer said “Google Cloud takes care of a lot of the complexity in our business…, and that frees us up to (focus on) Spotify specific things.” One example of this is the Assured Workloads program, which organizations can use to attest that they are running within a regulatory scheme or set of IT policies. I am eager to see how this will free the clients we work with to focus on what is unique to them.
  2. The second is the digital transformation platform that allows application modernization enabled by Anthos. Google Cloud announced support for Anthos running on bare metal platforms and extensions to GCP’s data and analytics portfolio, including BigQuery Omni, streaming analytics, and a natural language query interface. I am interested to see Google Cloud’s approach here, not just in broadening the platform for Anthos to multi-cloud and the telecoms edge, but also in providing data to inform decision making. This and the announcements in the data and analytics suite support Cloudreach’s work to help clients make data-driven decisions regarding their application modernization efforts and adoption of more sophisticated analytics.
  3. The third is industry-specific digital transformation, delivered by solutions powered by Google Cloud’s AI/ML. On this topic, there were numerous examples, from more traditional verticals like retail (Carrefour) and finance (Goldman Sachs) to digital natives like Twitter and Spotify. I expect that by offering industry insights informed by its AI/ML, Google Cloud will raise the level of efficiency across entire industries. For Cloudreach, I see an opportunity to build better products, improve customer experience, and increase innovation with our clients, who, having used these insights to resolve the more commonplace industry concerns, are free to focus on what differentiates them.

Key announcements

There were a few announcements over the course of the event that we were particularly intrigued by. BigQuery Omni, for a start. Rich Pilling, the Lead for Data and Analytics, wrote about why we are excited about BigQuery Omni here. Our customers are exploring multi-cloud models with increasing frequency, and any tools that help us to deliver this are most welcome.

At Cloudreach, some of our data scientists and data architects have developed a reference architecture for a machine learning process that brings some of the disciplines of modern software engineering to ML development. We were especially interested in Nate Keating’s “An Introduction to MLOps on Google Cloud”, which focuses along the same lines.

Google Cloud also announced a Feature Store, a centralized repository of historical and current feature values. This will enable reuse across teams that ordinarily spend much of their time on feature engineering, and will help ensure consistency across the organization.
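The value of a feature store is easiest to see in miniature: entities map to named, timestamped feature values, so training jobs and online serving read the same definitions instead of each team re-deriving its own features. Here is a toy in-memory sketch of the idea; the class and method names are illustrative, not Google Cloud’s actual API:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FeatureStore:
    """Toy feature store: entity id -> feature name -> list of (timestamp, value)."""
    _data: dict = field(default_factory=dict)

    def write(self, entity_id: str, feature: str, value, ts: datetime) -> None:
        """Append a feature value observed at time `ts`."""
        self._data.setdefault(entity_id, {}).setdefault(feature, []).append((ts, value))

    def read_latest(self, entity_id: str, feature: str):
        """Most recent value, as an online serving lookup would return."""
        history = self._data[entity_id][feature]
        return max(history, key=lambda pair: pair[0])[1]

    def read_as_of(self, entity_id: str, feature: str, ts: datetime):
        """Value as of a past timestamp, for point-in-time-correct training data."""
        history = [(t, v) for t, v in self._data[entity_id][feature] if t <= ts]
        return max(history, key=lambda pair: pair[0])[1]

store = FeatureStore()
store.write("user_42", "7d_purchase_count", 3, datetime(2020, 9, 1))
store.write("user_42", "7d_purchase_count", 5, datetime(2020, 9, 8))
print(store.read_latest("user_42", "7d_purchase_count"))                       # 5
print(store.read_as_of("user_42", "7d_purchase_count", datetime(2020, 9, 3)))  # 3
```

The `read_as_of` lookup is the important part: serving the value that was current at training time prevents a model from leaking future information into its training set.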

Another feature that will prove valuable to our clients is the Continuous Monitoring service. This will watch production models and alert them to outliers, errors, skew, and concept drift. These services will allow our clients to build out their model development and deployment more efficiently and with greater confidence.
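To illustrate the kind of check such a monitoring service performs, here is a simple training-versus-serving drift test using the population stability index (PSI). The function, the sample data, and the 0.2 alert threshold are my own illustrative choices, not the service’s actual API:

```python
import math

def psi(expected, actual, bins=4, eps=1e-4):
    """Population stability index between training ('expected') and serving
    ('actual') samples of one numeric feature. A common rule of thumb treats
    PSI > 0.2 as a meaningful distribution shift worth alerting on."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant features

    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            # Clamp so serving values outside the training range land in edge bins.
            i = min(max(int((x - lo) / width), 0), bins - 1)
            counts[i] += 1
        # Smooth with eps so the log below never sees a zero proportion.
        return [max(c / len(sample), eps) for c in counts]

    e, a = proportions(expected), proportions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

training = [0.1 * i for i in range(100)]               # feature seen at training time
serving_ok = [0.1 * i for i in range(100)]             # same distribution in production
serving_shifted = [5.0 + 0.1 * i for i in range(100)]  # distribution has drifted

print(psi(training, serving_ok) < 0.2)       # True: no alert
print(psi(training, serving_shifted) > 0.2)  # True: raise a drift alert
```

A production service runs checks like this continuously against live prediction traffic, per feature, and pages someone when a threshold is crossed; the sketch only shows the statistic behind one such check.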

There was lots of news in the security sphere as well, with the announcement of a Google Cloud Security best practices center. This features expertise from Google and partners (Cloudreach is the security partner of the year, two years running), and the Google Cloud Security Showcase, with dozens of videos on how to address specific security challenges. Google Cloud has also introduced Confidential VMs, which provide memory encryption. Previously, you could encrypt data in transit and at rest; now you can process data while it remains encrypted as well.

I could probably write an entire book on the insights and announcements that came from the last nine weeks. Rather than do that, I will direct you to the Google Cloud website, where the material from the event is still available online.

Google Cloud continues to rapidly innovate and provide new ways for enterprises to adopt and find value in the cloud. I can’t wait to deliver these solutions for our customers and see more innovations next year (hopefully in-person!).

For more information about our Google Cloud practice click here.