
Google LLC is an American multinational technology company that specializes in Internet-related services and products, which include online advertising technologies, a search engine, cloud computing, software, and hardware. Google was launched in September 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University in California. Some of Google’s products are Google Docs, Google Sheets, Google Slides, Gmail, Google Search, Google Duo, Google Maps, Google Translate, Google Earth, and Google Photos.

Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both Ph.D. students at Stanford University in California. The project initially involved an unofficial "third founder", Scott Hassan, the original lead programmer who wrote much of the code for the original Google Search engine, but he left before Google was officially founded as a company.


Google Blog

  • Meet the entrepreneur connecting Kenyans to healthy food Mon, 16 May 2022 05:00:00 +0000

    When Binti Mwallau started Hasanat Ventures, her dairy processing company in Kenya, she expected some resistance from her peers in an industry dominated by men. But she was surprised to run into more skepticism from her customers. Despite her background in finance and biochemistry, many of them questioned her credibility as a woman entrepreneur.

    Worried that her gender would affect Hasanat Ventures’ reputation, Binti considered hiring a man as the face of the business. But she eventually decided against it, standing firm in her pride as a solo founder and committed to tearing down the perception that women-run businesses in Africa aren't as successful as those run by men.

    “I think we should be challenging the outdated narrative that businesses run by men are guaranteed to be more successful,” Binti says. “Based on research, we've seen that businesses run by women actually perform better. We should use this as an opportunity to prove that as a woman, you do stand a chance to succeed in everything that you do.”

    Just as important to Binti as breaking this bias was giving Kenyans more access to affordable nutrition. “I realized that many people couldn’t afford premium yogurt. So we entered the market with a high-quality product that’s affordable for lower and middle-income earners who have become more health-conscious,” she says.

    Binti knew she had to drive awareness for her brand, particularly to reach Kenyans who needed convincing about yogurt’s health benefits. So she turned to Google Digital Skills for Africa, which offers virtual classes to help entrepreneurs grow their skills and businesses, and completed a digital marketing course to help her get Hasanat Ventures online.

    “After participating in the course, we knew our online presence had to be bigger than just social media,” Binti says. “Now that we have a fully functional website, we are actually getting leads from outside Kenya.”

    As part of the course, Binti learned how to use Google Analytics to measure her website’s performance. She could now monitor traffic insights, analyze pageviews and better understand who was visiting her site.

    Binti’s determination and passion for her business are showing up in the results. In its first year, Hasanat Ventures supplied over 300 retailers with affordable dairy products. Three years later, it’s grown to support more than 50 farmers and even built its own production facility to keep up with demand.

    “I really want to make sure that I am visible and speaking up in spaces women don’t usually have access to,” Binti says. “As Hasanat Ventures continues to grow, I am confident I can help change the perception of African women in business.”

    58% of Africa’s entrepreneurs are women. That’s why we’re empowering them with the platform and tools to grow their businesses. Learn more about our #LookMeUp campaign, highlighting Africa’s women entrepreneurs like Binti who are working to break the bias.

  • Stadia Savepoint: April updates Thu, 12 May 2022 17:00:00 +0000

    It’s time for another round of our Savepoint series, where we recap the new games, features and updates available on Stadia.

    A video recapping Stadia's highlights from April 2022

    In April, players took a swing at the leaderboard in Golf With Your Friends. Meanwhile, those who wanted a bit more action fought the undead in House of the Dead: REMAKE, a callback to the original 1997 arcade classic. Like any game on Stadia, players could jump into these titles right away, without waiting for downloads or installs.

    Active Stadia Pro subscribers enjoyed four additions to the Pro games library, including two new titles that launched on Stadia — World War Z: Aftermath and City Legends - The Curse of the Crimson Shadow. Pro subscribers could also claim Chicken Police - Paint it RED! and Ys IX: Monstrum Nox at no cost to add to their growing library of titles. It’s easy to try out new games like these in the Pro library, especially since creating a new Stadia account includes a one-month trial of Stadia Pro.

    We’re continuing to make it easier for players to try Stadia in seconds, without having to create an account. In April, we doubled the number of timed trials for full games on Stadia, adding seven new trials — including World War Z: Aftermath, HUMANKIND and DRAGON QUEST XI S: Echoes of an Elusive Age. You can now choose from 13 trials to play, all at no cost.

    A GIF shows a user progressing through black Stadia menus and entering a blue gameplay environment in Risk of Rain 2, controlling a red character as they attack blue enemies firing orange fireballs.

    Stadia feature updates

    • Store game page refresh: Check out all content bundles, sales, available trials and add-on content for Stadia store games with an updated look on web and mobile web.
    The Stadia store page for Ys IX: Monstrum Nox shows a trial, bundled content, and other actions for users to try.
    • Public parties game search: Create a public party for any multiplayer game you’re playing with the new game search bar on web.

    Stadia Pro updates

    April content launches on Stadia:

    Stadia announcements in April:

    • DEATHRUN TV
    • Five Nights at Freddy’s: Security Breach

    As always, we’ll be back next month to share another recap. In the meantime, keep an eye on the Stadia Community Blog, Facebook, YouTube and Twitter for the latest on new games, features and updates.

  • Shared success in building a safer open source community Thu, 12 May 2022 16:35:00 +0000

    Today we joined the Open Source Security Foundation (OpenSSF), Linux Foundation and industry leaders for a meeting to continue progressing the open source software security initiatives discussed during January’s White House Summit on Open Source Security. During this meeting, Google announced the creation of its new “Open Source Maintenance Crew” — a dedicated staff of Google engineers who will work closely with upstream maintainers on improving the security of critical open source projects. In addition to this initiative, we contributed ideas and participated in discussions on improving the security and trustworthiness of open source software.

    Amid all this momentum and progress, it is important to take stock of how far we’ve come as a community over the past year and a half. In this post, we will provide an update on some major milestones and projects that have launched, and look toward the future and the work that still needs to be done.

    Know, Prevent, Fix

    A little over a year ago we published Know, Prevent, Fix, which laid out a framework for how the software industry could address vulnerabilities in open source software. At the time, interest in the topic was growing, and the hope was to generate momentum behind advancing and improving software supply-chain security.

    The landscape has changed greatly since then:

    • Prominent attacks and vulnerabilities involving critical open source software, such as the Log4j vulnerability and the Codecov supply-chain attack, made headline news, bringing a new level of awareness to the issue and unifying the industry to address the problem.
    • The US government formalized the push for higher security standards in the May 2021 Executive Order on Cybersecurity. The release of NIST’s Secure Software Development Framework (SSDF), a set of guidelines for secure software development practices, sparked an industry-wide discussion about how to implement them.
    • Last August, technology leaders including Google, Apple, IBM, Microsoft, and Amazon invested in improving cybersecurity — and Google alone pledged $10 billion over the next five years to strengthen cybersecurity, including $100 million to support third-party foundations, like OpenSSF, that manage open source security priorities and help fix vulnerabilities.

    In light of these changes, the Know, Prevent, Fix framework proved prescient: beyond just the increased discussion about open source security, we’re witnessing real progress in the industry to act on those discussions. In particular, the OpenSSF has become a community town hall for driving security engineering efforts, discussions, and industry-wide collaboration.

    These successes have also surfaced new challenges, though, and we believe the next step is to increase accessibility. Security tools should be more easily adopted into common developer workflows, more integrated across the ecosystem, and simpler to connect into projects. Underlying all of this is a need to streamline the process of matching projects with available funds and resources to enable security improvements.

    In this follow-up post, we discuss Google’s efforts in collaboration with the open source community to make progress on security goals over the past year, the lessons learned and how the industry can build on this momentum.

    Know

    Our goals for “Know” were to capture more precise data about vulnerabilities, establish a standard schema to track vulnerabilities across databases, and create tooling for better tracking of dependencies.

    Over the past year the community’s investment in Open Source Vulnerabilities (OSV) has resulted in a new vulnerability format. The format was developed and adopted by several open source ecosystems (Python, Rust, Go), as well as vulnerability databases such as GitHub’s Security Advisories (GHSA) and the Global Security Database. Google also worked closely with MITRE on the new CVE 5.0 JSON schema to simplify future interoperability. OSV.dev also supports a searchable vulnerability database that, thanks to the standardized format, aggregates vulnerabilities from all of these databases into one easily searched location.
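
    The standardized format also makes vulnerability data easy to query programmatically. Below is a minimal sketch, assuming OSV.dev’s public query endpoint (https://api.osv.dev/v1/query) and its JSON request shape; the package name and version are placeholders:

        import json
        import urllib.request

        def query_osv(name: str, ecosystem: str, version: str) -> list:
            """Ask OSV.dev which known vulnerabilities affect one package version."""
            payload = json.dumps({
                "package": {"name": name, "ecosystem": ecosystem},
                "version": version,
            }).encode("utf-8")
            req = urllib.request.Request(
                "https://api.osv.dev/v1/query",
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp).get("vulns", [])

        # Placeholder package/version; prints advisory IDs and summaries, if any.
        for vuln in query_osv("jinja2", "PyPI", "2.4.1"):
            print(vuln["id"], vuln.get("summary", ""))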

    During the Log4j vulnerability response, the Google-supported Open Source Insights project helped the community understand the impact of the vulnerability. This project analyzes open source packages and provides detailed graphs of dependencies and their properties. With this information, developers can understand how their software is put together and the consequences of changes in their dependencies—which, as Log4j showed, can be severe when affected dependencies are many layers deep in the dependency graph. Today, we’re also making the data powering Open Source Insights available as a public Google Cloud Dataset.
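
    With the data available as a public Google Cloud Dataset, dependency graphs can be explored with standard BigQuery tooling. A sketch using the google-cloud-bigquery client; the dataset, table and column names below are illustrative assumptions, not the dataset’s confirmed schema:

        from google.cloud import bigquery  # pip install google-cloud-bigquery

        client = bigquery.Client()

        # Count packages that depend directly on log4j-core, a rough proxy for
        # how widely a vulnerability like Log4j could propagate. Table and
        # column names are illustrative assumptions.
        query = """
            SELECT COUNT(DISTINCT Name) AS direct_dependents
            FROM `bigquery-public-data.deps_dev_v1.Dependencies`
            WHERE Dependency.Name = 'org.apache.logging.log4j:log4j-core'
        """
        for row in client.query(query).result():
            print(f"Direct dependents: {row.direct_dependents}")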

    The OSV project showed that connecting a CVE to the vulnerability patch development workflow can be difficult without precise vulnerability metadata. It will take cooperation across disparate development communities to reap the full benefits of the progress, but with collaboration OSV can scale quickly across language and project ecosystems.

    We believe the next major goal is to lower the barrier of entry for users. Integrating with developer tools and processes will bring high-quality information to where it is most useful. For instance, OSV findings can surface everywhere from code editors (e.g., when deciding whether to include a library) to deployment (e.g., stopping vulnerable workloads from deploying).

    Prevent

    “Prevent” was conceived to help users understand the risks of new dependencies so they can make informed decisions about the packages and components they consume.

    We’ve seen strong community involvement in the prevention of vulnerabilities, particularly in the Security Scorecards project. Scorecards evaluates a project’s adherence to security best practices and assigns scores that developers can consult before consuming a dependency. Users can choose to avoid projects that, for example, don’t use branch protection settings or employ dangerous workflows (which make projects vulnerable to malicious commits), and gravitate to projects that follow strong security practices like signing their releases and using fuzzing. Thanks to contributions from Cisco, Datto and several other open source contributors, there are now regular Scorecard scans of 1 million projects, and Scorecards has developed from a command line tool into an automated GitHub Action that runs after any change to a GitHub project. More organizations are adopting Scorecards, and the Scorecard GitHub Action has been installed on over 1,000 projects, with continued growth. With increased adoption, overall security will improve across entire ecosystems.
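
    Scorecard results can also be consumed programmatically, which is how they end up in dashboards and dependency-policy checks. A minimal sketch, assuming the project’s public REST API at api.securityscorecards.dev (the endpoint shape is an assumption based on the project’s documentation):

        import json
        import urllib.request

        # Fetch the latest Scorecard result for a repository.
        repo = "github.com/ossf/scorecard"
        url = f"https://api.securityscorecards.dev/projects/{repo}"
        with urllib.request.urlopen(url) as resp:
            result = json.load(resp)

        # Print the aggregate score, then each individual check.
        print("Aggregate score:", result.get("score"))
        for check in result.get("checks", []):
            print(f"{check['name']}: {check['score']}")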

    Additionally, Sigstore is helping prevent attacks by creating new tools for signing, verifying and protecting software. Recently, Kubernetes announced that it is using sigstore to sign its releases, showing that artifact signing at a large scale is now within reach. As adoption expands, we can expect stronger links between published source code and the binaries built from it.

    Community collaborators like Citi, Chainguard, Datadog, VMware and others have actively contributed to the OpenSSF’s SLSA framework. This project is based on Google’s internal Binary Authorization for Borg (BAB), which for more than a decade has been mitigating the risk of source and production attacks at Google. SLSA lays out an actionable path for organizations to increase their overall software supply-chain security by providing step-by-step guidelines and practical goals for protecting source and build system integrity. The SLSA framework addresses a limitation of Software Bills of Materials (SBOMs), which on their own do not provide sufficient information about integrity and provenance. An SBOM created using SLSA provenance and metadata is more complete and addresses both source code and build threat vectors. Using SLSA may also help users implement Secure Software Development Framework (SSDF) requirements.
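
    To make that concrete, here is a simplified sketch of the provenance a SLSA-conformant build attaches to an artifact. Field names follow the public SLSA v0.2 provenance predicate (an in-toto statement); every value below is a placeholder:

        # Simplified SLSA v0.2 provenance statement, expressed as a Python dict.
        # All URIs and digests are placeholders.
        provenance = {
            "_type": "https://in-toto.io/Statement/v0.1",
            "subject": [
                # The built artifact this provenance describes.
                {"name": "myapp", "digest": {"sha256": "<artifact-digest>"}},
            ],
            "predicateType": "https://slsa.dev/provenance/v0.2",
            "predicate": {
                "builder": {"id": "https://example.com/builders/ci@v1"},
                "buildType": "https://example.com/build-types/container@v1",
                "invocation": {
                    # Where the build instructions came from.
                    "configSource": {
                        "uri": "git+https://example.com/org/myapp@refs/heads/main",
                        "digest": {"sha1": "<commit-sha>"},
                    },
                },
                # Build inputs: the integrity and provenance detail an SBOM
                # alone lacks.
                "materials": [
                    {
                        "uri": "git+https://example.com/org/myapp",
                        "digest": {"sha1": "<commit-sha>"},
                    },
                ],
            },
        }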

    Continued improvements to the OSS-Fuzz service for open source developers have helped get over 2,300 vulnerabilities fixed across 500+ projects in the past year. Google has also been heavily investing in expanding the scope of fuzzing by adding support for new languages such as Java and Swift and developing bug detectors to find issues like Log4Shell.
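
    For Python projects, OSS-Fuzz integration builds on Atheris, Google’s coverage-guided fuzzing engine for Python. A minimal fuzz target looks roughly like the sketch below; parse_config is a hypothetical stand-in for the code under test:

        import sys

        import atheris  # pip install atheris


        def parse_config(data: bytes) -> None:
            # Hypothetical stand-in for the library function being fuzzed.
            data.decode("utf-8").splitlines()


        def TestOneInput(data: bytes) -> None:
            # The fuzzer calls this entry point with mutated inputs; any
            # uncaught exception or crash is reported as a finding.
            try:
                parse_config(data)
            except UnicodeDecodeError:
                pass  # expected for arbitrary bytes, not a bug


        atheris.instrument_all()  # enable coverage feedback
        atheris.Setup(sys.argv, TestOneInput)
        atheris.Fuzz()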

    Through the Linux Kernel Self-Protection Project, Google has been providing a steady stream of changes to overhaul internal kernel APIs so that the compiler can detect and stop buffer overflows in fragile areas that have seen repeated vulnerabilities. For everyone in the ecosystem staying current on Linux kernel versions, this removes a large class of flaws that could lead to security exploits.

    Looking ahead, this area’s rapid growth highlights the community’s concern about integrity in software supply chains. Users are searching for solutions that they can trust across ecosystems, such as provenance metadata that connects deployed software to its original source code. Additionally, we expect increased scrutiny of development processes to ensure that software is built in the most secure way possible.

    The next goals for open source software security should involve broad adoption of best practices and scalability. Increasing the use of these tools will multiply the positive effects as more projects become secured, but adoption needs to happen in a scalable way across ecosystems (e.g., via the OpenSSF Securing Package Repositories Working Group focused on improving security in centralized package managers). Education will be a driving force to speed the shift from project-by-project adoption to broadscale ecosystem conversion: greater awareness will bring greater momentum for change.

    Fix

    “Fix” was conceived to help users understand their options to remove vulnerabilities, enable notifications that help speed repairs, and fix widely used versions of affected software, not just the most recent versions.

    Last year, Google gave $15 million to open source, supporting innovation, security, collaboration and sustainability through our programs and services. This includes $7.5 million for targeted security efforts in areas such as supply chain security, fuzzing, kernel security and critical infrastructure security. For example, $2.5 million of the security funding went to the Alpha-Omega project, which made its first grant to the Node.js foundation to strengthen its security team and support vulnerability remediation.

    Other security investments include $1 million to SOS Rewards, and $300,000 to the Internet Security Research Group to improve memory safety by incorporating Rust into the Linux kernel. The remaining funding supports security audits, package signing, fuzzing, reproducible builds, infrastructure security and security research.

    Beyond financial investments, Google employees contribute their hours, effort, and code to tens of thousands of open source repositories each year. One issue frequently cited by open source maintainers is limited time. Since under-maintained, critical open source components are a security risk, Google is starting a new Open Source Maintenance Crew, a dedicated staff of Google engineers who will work closely with upstream maintainers on improving the security of critical open source projects. We hope that other enterprises that rely on open source will invest in similar efforts to help accelerate security improvements in the open source ecosystem.

    Up Next

    The amount of progress in the past year is very encouraging: we as an industry have come together to discuss, fund, and make headway on many of the difficult problems that affect us all. The solutions are not just being talked about, but also built, refined, and applied. Now we need to magnify this progress by integrating these solutions with tooling and language ecosystems: every open source developer should have effortless access to end-to-end security by default.

    Google is committed to continuing our work with the OpenSSF to achieve these goals. To learn more about the OpenSSF and join its efforts, check it out here.

  • YouTube receives brand safety distinction for second year Thu, 12 May 2022 13:00:00 +0000

    At YouTube, we’re committed to protecting our viewers, creators and advertisers. Last year, we became the first digital platform to receive content-level brand safety accreditation from the Media Rating Council (MRC). Today, the MRC has given us that accreditation again, making YouTube the only platform to hold this distinction. This is a testament to the investments we’ve made in responsibility, YouTube's top priority.

    “We congratulate Google for this noteworthy achievement,” says George W. Ivie, Executive Director and CEO of the MRC. “Brand safety and suitability are critical issues in today’s digital ad environment, and MRC’s accreditation of YouTube, first granted last year and continued today, remains a landmark achievement in providing marketers with strong assurances that their advertising investments on the YouTube platform are being well protected.”

    As part of this accreditation, the MRC extensively audited our content review systems, including the machine learning technology that analyzes content uploaded to our platform and the policies that determine which videos on YouTube are eligible to run ads. The MRC auditors also met with our brand safety personnel on site to review our processes and dug into how we protect our global community — including our procedures for evaluating content across different languages. The accreditation also recognized YouTube’s advertiser safety error rate, a metric authorized by the Global Alliance for Responsible Media (GARM), which evaluates the total percentage of ad impressions that run across violative content.

    “We’re thrilled to see YouTube take another industry-leading step in their continued accreditation with MRC this year,” says Robert Rakowitz, Initiative Lead, GARM. “With this latest certification, YouTube fulfills a key request from advertisers and agencies in having an audit oversight body approve a core metric on the safety of their monetization practices. This is a step to celebrate and a further demonstration of YouTube’s commitment to GARM’s mission.”

    Our continued accreditation confirms that our strategy and systems are keeping pace with the current environment. And it builds on our commitment to remaining at least 99% effective at ensuring brand safety of advertising placements on YouTube, in accordance with industry standards.

    In addition to working with the MRC and GARM to raise the bar on brand safety, we’re also improving brand suitability. Over the past two years, we’ve worked directly with advertisers and agencies to better understand their needs and develop a set of best practices, such as anchoring on YouTube’s inventory modes and reassessing whether they should exclude certain types of content. When advertisers knew how to better navigate our suitability controls, they experienced performance benefits ranging from increased reach and view-through rates to decreased cost-per-view.

    We’re now using these best practices and customer feedback to evolve our suitability offering. This will include intuitive controls, more consistency across all Google inventory and clarity on how controls may impact ad campaigns. We’ll share more details in the coming months.

    “Better suitability controls allow advertisers to access and support more diverse content and audiences in a brand-safe way,” says Luis Di Como, EVP, Global Media, Unilever. “Unilever has long championed a responsible and safe online environment, and we are encouraged by YouTube’s commitment to create a positive digital ecosystem that is safe and inclusive for all.”

    By extending the rigor of our brand safety systems to our suitability solutions, we hope to continue to help advertisers tap into the full scale and potential of YouTube.

  • New ways to stay connected and entertained in your car Thu, 12 May 2022 12:00:00 +0000

    Our work in cars has always been guided by our goal to help make your driving experience easier and safer. Today, we’re introducing several updates for cars compatible with Android Auto and cars with Google built-in to help you stay connected and entertained while enhancing your experience on the road.

    A brand-new look for Android Auto

    Since it first launched, Android Auto has expanded to support more than 150 million cars across nearly every car brand. And over the years, we’ve found there are three main functionalities that drivers prioritize in their cars: navigation, media and communication. This summer, Android Auto will roll out a brand new interface that will help you get directions faster, control your media more easily and have more functionality at your fingertips.

    Car dashboard with display showcasing new Android Auto design in different screen sizes

    Built to adapt to any screen size

    With split screen mode, now standard across all screen types and sizes, you’ll have access to your most-used features all in one place — no need to return to your home screen or scroll through a list of apps. With your navigation and media always on, you won’t have to worry about missing your next turn while changing your favorite commute podcast. And with the new design able to adapt to different screen sizes, it looks great across widescreen, portrait and more.

    New features for Android Auto

    Google Assistant is bringing contextual suggestions to help you be more productive in the car. From suggested replies to messages, to sharing arrival times with a friend, to playing recommended music, Google Assistant is helping you do more in the car efficiently.

    In addition to using your voice, you can now quickly message and call favorite contacts with just one tap, and reply to messages by simply selecting a suggested response on the screen – helping you communicate effectively, while allowing you to keep your eyes on the road. Keep an eye out for these updates to Android Auto in the coming months.

    Stay connected and entertained with Google built-in

    Cars with Google built-in often come with large displays, and we’re continuing to build new experiences for those displays while your car is parked. We previously announced we’re bringing YouTube to cars with Google built-in and more video streaming apps will join the queue, including Tubi and Epix Now. So, when you’re parked waiting for your car to charge or at curbside pickup, you’ll be able to enjoy video directly from your car display.

    As we work to add more capabilities to cars with Google built-in in the future, you’ll be able to not only browse the web directly from your car display, but also cast your own content from your phone to your car screen.

    Car dashboard with display showcasing Tubi

    Enjoy video content directly from your car’s screen while parked

    Across Android Auto and cars with Google built-in, we’re working hard to ensure every drive is a helpful and connected experience.

  • 100 things we announced at I/O Wed, 11 May 2022 23:00:00 +0000

    And that’s a wrap on I/O 2022! We returned to our live keynote event, packed in more than a few product surprises, showed off some experimental projects and… actually, let’s just dive right in. Here are 100 things we announced at I/O 2022.

    Gear news galore

    Pixel products grouped together on a white background. Products include Pixel Buds Pro, Google Pixel Watch and Pixel phones.
    1. Let’s start at the very beginning — with some previews. We showed off a first look at the upcoming Pixel 7 and Pixel 7 Pro, powered by the next version of Google Tensor.
    2. We showed off an early look at Google Pixel Watch! It’s our first-ever all-Google built watch: 80% recycled stainless steel, Wear OS, Fitbit integration, Assistant access…and it’s coming this fall.
    3. Fitbit is coming to Google Pixel Watch. More experiences built for your wrist are coming later this year from apps like Deezer and Soundcloud.
    4. Later this year, you’ll start to see more devices powered with Wear OS from Samsung, Fossil Group, Montblanc and others.
    5. Google Assistant is coming soon to the Samsung Galaxy Watch 4 series.
    6. The new Pixel Buds Pro use Active Noise Cancellation (ANC), a feature powered by a custom 6-core audio chip and Google algorithms to put the focus on your music — and nothing else.
    7. Silent Seal™ helps Pixel Buds Pro adapt to the shape of your ear, for better sound. Later this year, Pixel Buds Pro will also support spatial audio to put you in the middle of the action when watching a movie or TV show with a compatible device and supported content.
    8. They also come in new colors: Charcoal, Fog, Coral and Lemongrass. Ahem, multiple colors — the Pixel Buds Pro have a two-tone design.
    9. With Multipoint connectivity, Pixel Buds Pro can automatically switch between your previously paired Bluetooth devices — including compatible laptops, tablets, TVs, and Android and iOS phones.
    10. Plus, the earbuds and their case are water-resistant.
    11. …And you can preorder them on July 21.
    12. Then there’s the brand new Pixel 6a, which comes with the full Material You experience.
    13. The new Pixel 6a has the same Google Tensor processor and hardware security architecture with Titan M2 as the Pixel 6 and Pixel 6 Pro.
    14. It also has dual rear cameras — main and ultrawide lenses.
    15. You’ve got three Pixel 6a color options: Chalk, Charcoal and Sage. The options keep going if you pair it with one of the new translucent cases.
    16. It costs $449 and will be available for pre-order on July 21.
    17. We also showed off an early look at the upcoming Pixel tablet, which we’re aiming to make available next year.

    Android updates

    18. In the last year, over 1 billion new Android phones have been activated.

    19. You’ll no longer need to grant location access to apps to enable Wi-Fi scanning in Android 13.

    20. Android 13 will automatically delete your clipboard history after a short time to preemptively block apps from seeing old copied information.

    21. Android 13’s new photo picker lets you select the exact photos or videos you want to grant access to, without needing to share your entire media library with an app.

    22. You’ll soon be able to copy a URL or picture from your phone, and paste it on your tablet in Android 13.

    23. Android 13 allows you to select different language preferences for different apps.

    24. The latest Android OS will also require apps to get your permission before sending you notifications.

    25. And later this year, you’ll see a new Security & Privacy settings page with Android 13.

    26. Google’s Messages app already has half a billion monthly active users with RCS, a new standard that enables you to share high-quality photos, see typing indicators, message over Wi-Fi and get a better group messaging experience.

    27. Messages is getting a public beta of end-to-end encryption for group conversations.

    28. Early earthquake warnings are coming to more high-risk regions around the world.

    29. On select headphones, you’ll soon be able to automatically switch audio between the devices you’re listening on with Android.

    30. Stream and use messaging apps from your Android phone to laptop with Chromebook’s Phone Hub, and you won’t even have to install any apps.

    31. Google Wallet is here! It’s a new home for things like your student ID, transit tickets, vaccine card, credit cards and debit cards.

    32. You can even use Google Wallet to hold your Walt Disney World park pass.

    33. Google Wallet is coming to Wear OS, too.

    34. Improved app experiences are coming for Android tablets: YouTube Music, Google Maps and Messages will take advantage of the extra screen space, and more apps coming soon include TikTok, Zoom, Facebook, Canva and many others.

    Developer deep dive

    Illustration depicting a smart home, with lights, thermostat, television, screen and mobile device.

    35. The Google Home and Google Home Mobile software developer kit (SDK) for Matter will be launching in June as developer previews.

    36. The Google Home SDK introduces Intelligence Clusters, which make intelligence features like Home and Away available to developers.

    37. Developers can even generate QR codes for Google Wallet to create their own passes for any use case they’d like.

    38. Matter support is coming to the Nest Thermostat.

    39. The Google Home Developer Center has lots of updates to check out.

    40. There’s now built-in support for Matter on Android, so you can use Fast Pair to quickly connect Matter-enabled smart home devices to your network, Google Home and other accompanying apps in just a few taps.

    41. The ARCore Geospatial API makes Google Maps’ Live View technology available to developers for free. Companies like Lime are using it to help people find parking spots for their scooters and save time.

    42. DOCOMO and Curiosity are using the ARCore Geospatial API to build a new game that lets you fend off virtual dragons with robot companions in front of iconic Tokyo landmarks, like the Tokyo Tower.

    43. AlloyDB is a new, fully managed, PostgreSQL-compatible database service designed to help developers manage enterprise database workloads — in our performance tests, it’s more than four times faster for transactional workloads and up to 100 times faster for analytical queries than standard PostgreSQL. (A minimal connection sketch appears after this list.)

    44. AlloyDB uses the same infrastructure building blocks that power large-scale products like YouTube, Search, Maps and Gmail.

    45. Google Cloud’s machine learning cluster powered by Cloud TPU v4 Pods is super powerful — in fact, we believe it’s the world’s largest publicly available machine learning hub in terms of compute power…

    46. …and it operates at 90% carbon-free energy.

    47. We also announced a preview of Cloud Run jobs, which reduces the time developers spend running administrative tasks like database migration or batch data transformation.

    48. We announced Flutter 3.0, which will enable developers to publish production-ready apps to six platforms at once, from one code base (Android, iOS, web, Linux, Windows and macOS).

    49. To help developers build beautiful Wear apps, we announced the beta of Jetpack Compose for Wear OS.

    50. We’re making it faster and easier for developers to build modern, high-quality apps with new Live Edit features in Android Studio.
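
    Because AlloyDB (item 43 above) speaks the PostgreSQL wire protocol, existing PostgreSQL clients should connect to it unchanged. A minimal sketch with the standard psycopg2 driver; the host, database and credentials are placeholders:

        import psycopg2  # pip install psycopg2-binary

        # AlloyDB is PostgreSQL-compatible, so a standard driver connects the
        # same way it would to any PostgreSQL instance. Values are placeholders.
        conn = psycopg2.connect(
            host="10.0.0.5",  # instance IP
            dbname="appdb",
            user="app",
            password="<password>",
        )
        with conn, conn.cursor() as cur:
            cur.execute("SELECT version();")
            print(cur.fetchone()[0])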

    Help for the home

    GIF of a man baking cookies with a speech bubble saying “Set a timer for 10 minutes.” His Google Nest Hub Max responds with a speech bubble saying “OK, 10 min. And that’s starting…now.”

    51. Many Nest Devices will become Matter controllers, which means they can serve as central hubs to control Matter-enabled devices both locally and remotely from the Google Home app.

    52. Works with Hey Google is now Works with Google Home.

    53. The new home.google is your new hub for finding out everything you can do with your Google Home system.

    54. Nest Hub Max is getting Look and Talk, where you can simply look at your device to ask a question without saying “Hey Google.”

    55. Look and Talk works when Voice Match and Face Match recognize that it’s you.

    56. And video from Look and Talk interactions is processed entirely on-device, so it isn’t shared with Google or anyone else.

    57. Look and Talk is opt-in. Oh, and FYI, you can still say “Hey Google” whenever you want!

    58. Want to learn more about it? Just say “Hey Google, what is Look and Talk?” or “Hey Google, how do you enable Look and Talk?”

    59. We’re also expanding quick phrases to Nest Hub Max, so you can skip saying “Hey Google” for some of your most common daily tasks – things like “set a timer for 10 minutes” or “turn off the living room lights.”

    60. You can choose the quick phrases you want to turn on.

    61. Your quick phrases will work when Voice Match recognizes it’s you.

    62. And looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.

    Taking care of business

    Animated GIF demonstrating portrait light, bringing studio-quality lighting effects to Google Meet.

    63. Google Meet video calls will now look better thanks to portrait restore and portrait light, which use AI and machine learning to improve quality and lighting on video calls.

    64. Later this year we’re scaling the phishing and malware protections that guard Gmail to Google Docs, Sheets and Slides.

    65. Live sharing is coming to Google Meet, meaning users will be able to share controls and interact directly within the meeting, whether it’s watching an icebreaker video from YouTube or sharing a playlist.

    66. Automated built-in summaries are coming to Spaces so you can get a helpful digest of conversations to catch up quickly.

    67. De-reverberation for Google Meet will filter out echoes in spaces with hard surfaces, giving you conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.

    68. Later this year, we're bringing automated transcriptions of Google Meet meetings to Google Workspace, so people can catch up quickly on meetings they couldn't attend.

    Apps for on-the-go

    A picture of London in immersive view.

    69. Google Wallet users will be able to check the balance of transit passes and top up within Google Maps.

    70. Google Translate added 24 new languages.

    71. As part of this update, Indigenous languages of the Americas (Quechua, Guarani and Aymara) and an English dialect (Sierra Leonean Krio) have also been added to Translate for the first time.

    72. Google Translate now supports a total of 133 languages used around the globe.

    73. These are the first languages we’ve added using Zero-resource Machine Translation, where a machine learning model only sees monolingual text — meaning, it learns to translate into another language without ever seeing an example.

    74. Google Maps’ new immersive view is a whole new way to explore so you can see what an area truly looks and feels like.

    75. Immersive view will work on nearly any phone or tablet; you don’t need the fanciest or newest device.

    76. Immersive view will first be available in L.A., London, New York, San Francisco and Tokyo — with more places coming soon.

    77. Last year we launched eco-friendly routing in the U.S. and Canada. Since then, people have used it to travel 86 billion miles, which saved more than half a million metric tons of carbon emissions — that’s like taking 100,000 cars off the road.

    78. And we’re expanding eco-friendly routing to more places, like Europe.

    All in on AI

    Ten circles in a row, ranging from dark to light.

    The 10 shades of the Monk Skin Tone Scale.

    79. A team at Google Research partnered with Harvard’s Dr. Ellis Monk to openly release the Monk Skin Tone Scale, a new tool for measuring skin tone that can help build more inclusive products.

    80. Google Search will use the Monk Skin Tone Scale to make it easier to find more relevant results — for instance, if you search for “bridal makeup,” you’ll see an option to filter by skin tone so you can refine to results that meet your needs.

    81. Oh, and the Monk Skin Tone Scale was used to evaluate a new set of Real Tone filters for Photos that are designed to work well across skin tones. These filters were created and tested in partnership with artists like Kennedi Carter and Joshua Kissi.

    82. We’re releasing LaMDA 2 as part of the AI Test Kitchen, a new space to learn, improve, and innovate responsibly on this technology together.

    83. PaLM is a new language model that can solve complex math word problems, and even explain its thought process, step-by-step.

    84. Nest Hub Max’s new Look and Talk feature uses six machine learning models to process more than 100 signals in real time to detect whether you’re intending to make eye contact with your device and talk to Google Assistant, rather than just giving it a passing glance.

    85. We recently launched multisearch in the Google app, which lets you search by taking a photo and asking a question at the same time. At I/O, we announced that later this year, you'll be able to take a picture or screenshot and add "near me" to get local results from restaurants, retailers and more.

    86. We introduced you to an advancement called “scene exploration,” where in the future, you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.

    Privacy, security and information

    A GIF that shows someone’s Google account with a yellow alert icon, flagging recommended actions they should take to secure their account.

    87. We’ve expanded our support for Project Shield to protect the websites of 200+ Ukrainian government agencies, news outlets and more.

    88. Account Safety Status will add a simple yellow alert icon to flag actions you should take to secure your Google Account.

    89. Phishing protections in Google Workspace are expanding to Docs, Slides and Sheets.

    90. My Ad Center is now giving you even more control over the ads you see on YouTube, Search, and your Discover feed.

    91. Virtual cards are coming to Chrome and Android this summer, adding an additional layer of security and eliminating the need to enter certain card details at checkout.

    92. In the coming months, you’ll be able to request removal of Google Search results that have your contact info with an easy-to-use tool.

    93. Protected Computing is a toolkit that helps minimize your data footprint, de-identifies your data and restricts access to your sensitive data.

    94. On-device encryption is now available for Google Password Manager.

    95. We’re continuing to auto-enroll people in 2-Step Verification to reduce phishing risks.

    What else?!

    Illustration of a black one-story building with large windows. Inside are people walking around wooden tables and white walls containing Google hardware products. There is a Google Store logo on top of the building.

    96. A new Google Store is opening in Williamsburg.

    97. This is our first “neighborhood store” — it’s in a more intimate setting that highlights the community. You can find it at 134 N 6th St., opening on June 16.

    98. The store will feature an installation by Brooklyn-based artist Olalekan Jeyifous.

    99. Visitors there can picture everyday life with Google products through interactive displays that show how our hardware and services work together, and even get hands-on help with devices from Google experts.

    100. We showed a prototype of what happens when we bring technologies like transcription and translation to your line of sight.

  • Make connections that Matter in Google Home Wed, 11 May 2022 22:15:00 +0000

    We’re entering a new era of the smart home built on openness and collaboration — one where you should have no problem using devices from different smart home brands to turn on your lights, warm up your living room and set your morning alarm. All of them should work together in harmony.

    Matter, the new smart home industry standard we developed with other leading technology companies, is making this possible. Whether you’re shopping for or building your own smart home devices, let’s take a closer look at how Matter can help you make more connections with Google products and beyond when it launches later this year.

    Connect your favorite smart home brands

    When you buy a Matter-enabled device, the set-up process will be quick and consistent. In just a few taps, you can easily link it to your home network, another smart home ecosystem and your favorite apps. Support for Matter through Fast Pair on Android makes it as easy as connecting a new pair of headphones. And because Matter devices connect and communicate locally over Wi-Fi and Thread, a wireless mesh networking technology, they’re more reliable and responsive — reducing lag and potential connection interruptions.

    To help you get ready for Matter, we’ll update many Google Nest devices to be Matter controllers. This will let you connect all your Matter-enabled devices to Google Home, and control them both locally and remotely with the Google Home app, smart home controls on your Android phone or Google Assistant. Matter controllers will include the original Google Home speaker, Google Home Mini, Nest Mini, Nest Hub (1st and 2nd gen), Nest Hub Max, Nest Audio and Nest Wifi.

    Meanwhile, Nest Wifi, Nest Hub Max and Nest Hub (2nd gen) will also serve as Thread border routers, allowing you to connect devices built with Thread — like backyard lights that need long-range connectivity — to your home network.

    We’ve also rolled out a new Google Home site to help you explore everything you can do with your Google Home in one spot. You can discover thousands of smart home devices that work with Google Home and learn how to get the most out of your helpful home — including automated routines to make everyday life easier, safer and more convenient.

    To make it easier to find products that work great with Google Home, we're updating our “Works with” partner program. Works with Hey Google is now Works with Google Home. Partner devices that carry this badge have gone the extra mile to build high-quality experiences with Google using Matter or our existing integrations. It’ll take some time for all our partners to start using the new badge — but if you spot either of these badges on a smart home product, you’ll know they easily connect with Google and our home control features like routines, voice control through Google Assistant devices and Android home controls.

    Build more connected smart home devices

    Developers, take note: With Matter, there’s no need to build multiple versions of a smart home device to work across different ecosystems. You’ll only have to build once, and that device will work right away with Google Home and other smart home brands. This means you can spend less time building multiple connectivity paths, and more time innovating and delivering devices and features.

    To help you do that, we’ve launched a new Google Home Developer Center that brings together all our resources for developers and businesses. You can start learning today how to build smart home devices and Android apps with Matter, discover new features to integrate into your devices and explore marketing resources to help grow your business. You’ll also find new community support tools for device makers building with Google Home.

    On June 30, we’ll launch the Google Home Developer Console, including two new software development kits (SDKs) to make it easy to build Matter devices and apps. The Google Home Device SDK is the fastest way to start building Matter devices. This SDK will also introduce Intelligence Clusters, which will share Google Intelligence — starting with Home & Away Routines — with developers who meet certain security and privacy requirements.

    The new Google Home Mobile SDK will make it easy to build apps that connect directly with Matter devices using new built-in connectivity support in Android. This makes the set-up process simpler, more consistent and reliable for Android users. And with connectivity taken care of, developers can spend more time building unique features and experiences.

    We can’t wait to see how you use Matter, Thread and Google Home to build and create the smart home experience that best suits you. Check out home.google.com and developers.home.google.com to learn more and sign up for future updates.

  • Take a look at our new Pixel portfolio, made to be helpful Wed, 11 May 2022 19:00:00 +0000

    From phones and smartwatches to tablets and laptops — our day-to-day lives can be filled with so many devices, and dealing with them should be easy. This is why we’re focused on building hardware and software that work together to anticipate and react to your requests, so you don’t have to spend time fussing with technology.

    To bring this vision to life, we’ve spent years focusing on ambient computing and how it can help us build technology that fades into the background, while being more useful than ever. Today at I/O, I shared several important updates to our hardware portfolio that lay the groundwork for creating a family of devices that not only work better together, but work together for you.

    Meet the new Pixel portfolio

    We’ve thoughtfully designed the Pixel portfolio so the helpfulness and intelligence of Google can adapt to you in a non-intrusive way. This is all possible thanks to multi-device work from the Android team combined with our work to layer cutting-edge AI research and helpful software and services onto our devices. And of course, we always tightly integrate powerful data security directly into our hardware.

    Pixel products grouped together on a white background. Products include Pixel Buds Pro, the Google Pixel Watch and Pixel phones.

    Last year we launched Google Tensor, our first custom-designed mobile system on a chip (SoC), to create a common platform for our Pixel phones. The first Pixels built with Tensor, Pixel 6 and Pixel 6 Pro, are the fastest selling Pixel phones to date. And today we introduced the new Pixel 6a, which has the same Tensor processor and industry-leading security from our Titan M2 chip.

    Our Pixel Buds are designed to perfectly complement your Pixel phone, and we’re excited to expand the earbuds offerings with Pixel Buds Pro. These premium earbuds include a new, custom 6-core audio chip that runs Google-developed algorithms — all tuned by our in-house audio engineering team.

    A sneak peek of what’s to come

    Building on our ambient computing vision, we’re focused on how Pixel devices can be even more helpful to you — now and in the future. Today, we gave a preview of our new Google Pixel Watch — the first watch we’ve built inside and out. It has a bold circular, domed design, a tactile crown, recycled stainless steel and customizable bands that easily attach. With this watch, you’ll get the new Wear OS by Google experience and Fitbit’s industry-leading health and fitness tools — right on your wrist. Google Pixel Watch is a natural extension of the Pixel family, providing help whenever and wherever you need it. It will be available this fall, and we’ll share more details in the coming months.

    Animated GIF showing the Google Pixel Watch with a white band.

    We also previewed our Pixel 7 phones, coming this fall. Our next version of Google Tensor will power these devices, which are built for those who want the latest technology and fastest performance.

    And finally, we shared an early look at our Android tablet, powered by Google Tensor. Built to be the perfect companion for your Pixel phone, our tablet will blend into your day-to-day routine and help connect the moments you’re on the go with the moments you’re at home. We hope to have more to share here in 2023, so stay tuned.

    We’re building out the Pixel portfolio to give you more options for varying budgets and needs. I can’t wait for everyone to see for themselves how helpful these devices and technology can be — from wearables, phones and tablets to audio and smart home technology. And if you’re headed to the New York area, you can see these devices in action at our second Google Store that’s opening this summer in Brooklyn.

  • Loud and clear, Pixel Buds Pro are here Wed, 11 May 2022 18:55:00 +0000

    Have you heard? Google Pixel Buds Pro are here. These premium wireless earbuds with Active Noise Cancellation bring you full, immersive sound — now that’s music to our ears. Pixel Buds Pro are built to work great across our full Pixel portfolio and with other Android phones, and they’re packed with all the helpfulness and smarts you expect from Google. You can pre-order Pixel Buds Pro on July 21 for $199.

    Immersive sound that adapts to you

    Great art starts with a blank canvas, and it’s no different with sound. To set the foundation for your music to shine without distractions, Pixel Buds Pro use Active Noise Cancellation (ANC). We built our ANC with a custom 6-core audio chip that runs Google-developed algorithms — all tuned by our in-house audio engineering team — and custom speakers.

    Pixel Buds Pro use a custom-built 6-core audio chip for Active Noise Cancellation.

    Everyone’s ears are unique, so it’s not always possible for the eartips to create a perfect seal that prevents sound from leaking in from the outside. Pixel Buds Pro use Silent Seal™ to adapt to your ear, to help maximize the amount of noise that’s canceled. And built-in sensors will measure the pressure in your ear canal to make sure you’re comfortable even during long listening sessions. Say goodbye to that annoying plugged ear feeling!

    Once you’re listening to your music or podcast, Volume EQ will adjust the tuning as you turn the volume up or down — so highs, mids and lows consistently sound balanced. Later this year, Pixel Buds Pro will also support spatial audio. So when you watch a spatial audio-supported movie or TV show on compatible Pixel phones, you’ll feel like you're in the middle of the action.

    As versatile as you are

    Pixel Buds Pro adapt throughout your day by anticipating your next move. If you end a video call on your laptop to head out on a walk and listen to music, you won’t need to fumble around with Bluetooth menus. With Multipoint connectivity, Pixel Buds Pro can automatically switch between your previously paired Bluetooth devices — including laptops, tablets, TVs, and Android and iOS phones.

    Once you’re on that walk, Pixel Buds Pro will help you place clear calls even if it's loud and windy outside. And of course, Google Assistant is there to give you hands-free help. Just say “Hey Google,” and ask the Assistant for whatever you need — like walking directions or even real-time translation in 40 languages.

    Want to stay aware of your surroundings? Transparency mode lets ambient noise in so you can hear what’s going on around you — perfect for crossing a busy street, waiting for your order at a cafe or walking around town.

    A close up shot of a person in athletic gear wearing Pixel Buds Pro in their ear.

    And if you’re sweating through an intense workout or jogging in light rain, your new Pixel Buds Pro have you covered. The earbuds have IPX4 water resistance, and the case is IPX2 water resistant.

    Designed to look good and last throughout your day

    Pixel Buds Pro are built to suit your lifestyle and look just as good as they sound. They come in a soft matte finish and a two-tone design. Pick from four color options: Coral, Lemongrass, Fog and Charcoal.

    Pixel Buds Pro come in four colors: Coral, Lemongrass, Fog and Charcoal.

    No matter what you’re doing, you can trust they’ll get you through your day. Pixel Buds Pro charge wirelessly and give you up to 11 hours of listening time or up to 7 hours with Active Noise Cancellation turned on, so rest assured you can tune out the noise on that long flight.

  • Pixel 6a: More of what you want for less than you expect Wed, 11 May 2022 18:55:00 +0000

    Our latest A-series phone, Google Pixel 6a, gives you more of what you want — for less than you’d expect. Pixel 6a is packed with the same powerful brains, Google Tensor, and many of the must-have features as our premium phones Pixel 6 and Pixel 6 Pro — at a lower price of $449.

    Designed with you in mind

    Pixel 6a borrows many of the same design elements from Pixel 6 — including the iconic camera bar — along with a metal frame that is durable by design. You’ll also get the updated Material You design UX that lets you personalize the look and feel of your phone, making it truly yours. Show off your colorful side and coordinate your aesthetic with one of three phone colors: Chalk, Charcoal and Sage.

    Image of the phone in Charcoal, Chalk and Sage

    For added protection and even more color options, pick out one of the cases made specifically for Pixel 6a — they're translucent and can be mixed and matched to create unique color combos. You’ll also have your choice of cases from our Made for Google partners.

    Translucent Pixel 6a cases in Carbon, Frosted Clear and Seafoam.

    Fully loaded with the features you love

    From exceptional camera features to speech recognition to security you can trust, many of your favorite features from Pixel 6 and Pixel 6 Pro will be joining the party — thanks to Google Tensor. Here’s a look at some of them.

    Pixel 6a helps capture your most important moments with a Camera Bar that includes dual rear cameras: a main lens and an ultrawide lens. So rest assured you can capture the whole scene. As for the selfie camera on Pixel 6a, it’s the same great camera as Pixel 6.

    The Pixel Camera is built to be versatile and adapt to your needs, and you’ll see some of those features and technologies on Pixel 6a — from Real Tone, which authentically represents all skin tones, to Night Sight, which makes low-light photography a breeze, to Magic Eraser in Google Photos, which makes distractions disappear. And good news, we’ve enhanced Magic Eraser so you can also change the color of distracting objects in your photo. In just a few taps, the object’s colors and shading blend in naturally. So the focus is on the subjects — where it should be.

    Pixel 6a comes with the same highly accurate speech recognition as Pixel 6 Pro. That includes features like Recorder, Live Caption and Live Translate.

    Pixel 6a is your personal translator with Live Translate. Animated GIF showing Live Translate in action.

    With Live Translate, you’ll have a personal translator wherever you go! Find more details and availability here.

    The power and safety of Google Tensor

    You’ll get the full hardware and software experience you’d expect with Google Tensor without compromising on battery life. Pixel 6a comes with an all-day battery that can last up to 72 hours in Extreme Battery Saver mode — a first for Pixel phones.[edfc02] With Google Tensor, Pixel 6a shares the same security architecture as Pixel 6 Pro, including our dedicated security chip Titan M2, which gives you the peace of mind that your sensitive data is safe.

    With this common hardware platform across our latest phones, Pixel 6a will receive five years of security updates from when the device first becomes available on GoogleStore.com in the U.S., just like Pixel 6 and Pixel 6 Pro. Plus, Pixel 6a comes with Feature Drops so you get the latest and greatest features and updates. And as with other Pixel devices, Pixel 6a will be among the first Android devices to receive the upcoming Android 13 update.

    Pixel 6a will be available for pre-order starting at $449 on July 21 and on shelves on July 28. Find out what countries Pixel 6a will be available in, and sign up for product updates.

  • Ask a Techspert: How do digital wallets work? Wed, 11 May 2022 18:17:00 +0000

    In recent months, you may have gone out to dinner only to realize you left your COVID vaccine card at home. Luckily, the host is OK with the photo of it on your phone. In this case, it’s acceptable to show someone a picture of a card, but for other things it isn’t — an image of your driver’s license or credit card certainly won’t work. So what makes digital versions of these items more legit than a photo? To better understand the digitization of what goes into our wallets and purses, I talked to product manager Dong Min Kim, who works on the brand new Google Wallet. Google Wallet, which will be coming soon in over 40 countries, is the new digital wallet for Android and Wear OS devices…but how does it work?

    Let’s start with a basic question: What is a digital wallet?

    A digital wallet is simply an application that holds digital versions of the physical items you carry around in your actual wallet or purse. We’ve seen this shift where something you physically carry around becomes part of your smartphone before, right?

    Like…?

    Look at the camera: You used to carry around a separate item, a camera, to take photos. It was a unique device that did a specific thing. Then, thanks to improvements in computing power, hardware and image processing algorithms, engineers merged the function of the camera — taking photos — into mobile phones. So now, you don’t have to carry around both, if you don’t want to.

    Ahhh yes, I am old enough to remember attending college gatherings with my digital camera and my flip phone.

    Ha! So think about what else you carry around: your wallet and your keys.

    So the big picture here is that digital wallets help us carry around less stuff?

    That’s certainly something we’re thinking about, but it’s more about how we can make these experiences — the ones where you need to use a camera, or in our case, items from your wallet — better. For starters, there’s security: It's really hard for someone to take your phone and use your Google Wallet, or to take your card and add it to their own phone. Your financial institution will verify who you are before you can add a card to your phone, and you can set a screen lock so a stranger can’t access what’s on your device. And should you lose your device, you can remotely locate, lock or even wipe it from “Find My Device.”

    What else can Google Wallet do that my physical wallet can’t?

    If you saved your boarding pass for a flight to Google Wallet, it will notify you of delays and gate changes. When you head to a concert, you’ll receive a notification on your phone beforehand, reminding you of your saved tickets.

    Wallet also works with other Google apps — for instance if you’re taking the bus to see a friend and look up directions in Google Maps, your transit card and balance will show up alongside the route. If you're running low on fare, you can tap and add more. We’ll also give you complete control over how items in your wallet are used to enable these experiences; for example, the personal information on your COVID vaccine pass is kept on your device and never shared without your permission, not even with Google.

    Plus, even if you lose your credit or debit card and you’re waiting for the replacement to show up, you can still use that card with Google Wallet because of the virtual number attached to it.

    This might be taking a step backwards, but can I pay someone from my Google Wallet? As in can I send money from a debit card, or straight from my bank account?

    That’s actually where the Google Pay app — which is available in markets like the U.S., India and Singapore — comes in. We’ll keep growing this app as a companion app where you can do more payments-focused things like send and receive money from friends or businesses, discover offers from your favorite retailers or manage your transactions.

    OK, but can I pay with my Google Wallet?

    Yes, you can still pay with the cards stored in your Google Wallet in stores where Google Pay is accepted; it’s simple and secure.

    Use payment cards in Google Wallet in stores with Google Pay, got it — but how does everything else “get” into Wallet?

    We've already partnered with hundreds of transit agencies, retailers, ticket providers, health agencies and airlines so they can create digital versions of their cards or tickets for Google Wallet. You can add a card or ticket directly to Wallet, or within the apps or sites of businesses we partner with, you’ll see an option to add it to Wallet. We’re working on adding more types of content for Wallet, too, like digital IDs, or office and hotel keys.

    An image of the Google Wallet app open on a Pixel phone. The app is showing a Chase Freedom Unlimited credit card, a ticket for a flight from SFO to JFK, and a Walgreens cash reward pass. In the bottom right-hand corner, there is an “Add to Wallet” button.

    Developers can make almost any item into a digital pass. They can use the templates we’ve created, like for boarding passes and event tickets — or they can use a generic template if it’s something more unique and we don’t have a specific solution for it yet (see the sketch at the end of this Q&A). This invitation to developers is part of what I think makes Google Wallet interesting; it’s very open.

    What exactly do you mean by “open”?

    Well, the Android platform is open — any Android developer can use and develop for Wallet. One thing that’s great about that is all these features and tools can be made available on less expensive phones, too, so it isn’t only people who can afford the most expensive, newest phones out there who can use Google Wallet. Even if a phone can’t use some features of Google Wallet, it’s possible for developers to use QR or barcodes for their content, which more devices can access.

    So working with Google Wallet is easier for developers. Any ways you’re making things easier for users?

    Plenty of them! In particular, we’re working on ways to make it easy to add items directly from your phone, too. For instance, today if you take a screenshot of your boarding pass or COVID vaccine card on an Android device, we’ll give you the option to add it directly to your Google Wallet!
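
    Editor’s note: For developers curious what the pass-creation flow described above can look like, here is a minimal sketch of the common “save to wallet” pattern — a pass object carried inside a signed JWT that becomes an “Add to Google Wallet” link. The IDs, field names and file paths below are illustrative placeholders rather than the exact Google Wallet API schema, so treat this as a sketch and check the official developer documentation for the real contract.

        # Sketch: building an "Add to Google Wallet" link for a generic pass.
        # All IDs, field names and paths are illustrative placeholders — see
        # Google's Wallet developer docs for the actual schema.
        import jwt  # PyJWT

        ISSUER_EMAIL = "wallet-issuer@example-project.iam.gserviceaccount.com"  # placeholder
        with open("service-account-key.pem") as f:  # placeholder key file
            PRIVATE_KEY = f.read()

        generic_object = {
            "id": "1234567890.example-pass",        # "<issuerId>.<objectSuffix>" (placeholder)
            "classId": "1234567890.example-class",  # the pass template this object uses
            "cardTitle": {"defaultValue": {"language": "en-US", "value": "Example Rewards"}},
            "header": {"defaultValue": {"language": "en-US", "value": "Jane Doe"}},
            "barcode": {"type": "QR_CODE", "value": "member-0042"},
        }

        claims = {
            "iss": ISSUER_EMAIL,      # the issuing service account
            "aud": "google",          # audience for wallet saves
            "typ": "savetowallet",    # token type for the save flow
            "payload": {"genericObjects": [generic_object]},
        }

        token = jwt.encode(claims, PRIVATE_KEY, algorithm="RS256")
        print("https://pay.google.com/gp/v/save/" + token)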

  • Get the full picture with helpful context on websites Wed, 11 May 2022 18:15:00 +0000

    When you think about how you can stay safe online, you might immediately think of protecting your data, updating your passwords, or having control over your personal information. But another important part of online safety is being confident in the information you find.

    Information quality — in other words, surfacing relevant information from reliable sources — is a key principle of Google Search, and it’s one we relentlessly invest in. We also give you tools to evaluate for yourself the reliability of the information you come across.

    Helpful context on websites

    One of the tools we launched last year, About this Result, has now been used more than 1.6 billion times. This tool is available in English on individual Search results, helping you to see important context about a website before you even visit it. More languages will be available for this tool later this year.

    But we want to ensure you have the tools to evaluate information wherever you are online — not just on the search results page, but also if you’ve already picked a webpage to visit. So we’re making this helpful context more accessible as you explore the web.

    Soon, when you’re viewing a web page on the Google App, you'll be able to see a tab with information about the source with just a tap — including a brief description, what they say about themselves and what others on the web say about them.

    GIF showing the new helpful context feature for websites

    Imagine you’re researching conservation efforts, and find yourself on an unfamiliar website of a rainforest protection organization. Before you decide to donate, you’d like to understand if it’s an organization you feel confident you should support. With this update, you’ll be able to find helpful context about a source while you’re already on a website.

    You’ll be able to see context like this on any website — coming soon to the Google App on iOS and Android.

    We hope this will not only give you more context and peace of mind when you search, but also help you explore with confidence.

  • A new Search tool to help control your online presence Wed, 11 May 2022 18:00:00 +0000

    Have you ever searched for your name online to see what other people can find out about you? You’re not alone. And for many people, a key element of feeling safer and more private online is having greater control over where their sensitive, personally-identifiable information can be found.

    These days, it’s important to have simple tools to manage your online presence. That’s why we’re introducing a new tool in Google Search to help you easily control whether your personally-identifiable information can be found in Search results, so you can have more peace of mind about your online footprint.

    Remove results about you in Search

    You might have seen that we recently updated our policies to enable people to request the removal of sensitive, personally-identifiable information — including contact information, like a phone number, email address, or home address — from Search.

    Now, we’re making it easier for you to remove results that contain your contact information from Google. We’re rolling out a new tool to accompany our updated policies and streamline the request process.

    A gif showing a representation of a new tool that will allow people to easily request the removal of Search results containing their phone number, home address, or email address.

    When you’re searching on Google and find results about you that contain your phone number, home address, or email address, you’ll be able to quickly request their removal from Google Search — right as you find them. With this new tool, you can request removal of your contact details from Search with a few clicks, and you’ll also be able to easily monitor the status of these removal requests.

    This feature will be available in the coming months in the Google App, and you’ll also be able to make removal requests by going to the three dots next to individual Google Search results. In the meantime, you can make requests to remove your info from our support page.

    It’s important to note that when we receive removal requests, we will evaluate all content on the web page to ensure that we're not limiting the availability of other information that is broadly useful, for instance in news articles. And of course, removing contact information from Google Search doesn’t remove it from the web, which is why you may wish to contact the hosting site directly, if you're comfortable doing so.

    At Google, we strongly believe in open access to information, and we also have a deep commitment to protecting people — and their privacy — online. These changes are significant and important steps to help you manage your online presence — and we want to make sure it’s as easy as possible for you to be in control.

  • Improving skin tone representation across Google Wed, 11 May 2022 17:32:00 +0000

    Seeing yourself reflected in the world around you — in real life, media or online — is so important. And we know that challenges with image-based technologies and representation on the web have historically left people of color feeling overlooked and misrepresented. Last year, we announced Real Tone for Pixel, which is just one example of our efforts to improve representation of diverse skin tones across Google products.

    Today, we're introducing a next step in our commitment to image equity and improving representation across our products. In partnership with Harvard professor and sociologist Dr. Ellis Monk, we’re releasing a new skin tone scale designed to be more inclusive of the spectrum of skin tones we see in our society. Dr. Monk has been studying how skin tone and colorism affect people’s lives for more than 10 years.

    The culmination of Dr. Monk’s research is the Monk Skin Tone (MST) Scale, a 10-shade scale that will be incorporated into various Google products over the coming months. We’re openly releasing the scale so anyone can use it for research and product development. Our goal is for the scale to support inclusive products and research across the industry — we see this as a chance to share, learn and evolve our work with the help of others.

    Ten circles in a row, ranging from dark to light.

    The 10 shades of the Monk Skin Tone Scale.

    This scale was designed to be easy to use for the development and evaluation of technology while representing a broader range of skin tones. In fact, our U.S. research found that participants considered the Monk Skin Tone Scale more representative of their skin tones than the current tech industry standard. This was especially true for people with darker skin tones.

    “In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” Dr. Monk says. “And many methods of categorization, including past skin tone scales, don’t pay attention to this diversity. That’s where a lack of representation can happen…we need to fine-tune the way we measure things, so people feel represented.”

    Using the Monk Skin Tone Scale to improve Google products

    Updating our approach to skin tone can help us better understand representation in imagery, as well as evaluate whether a product or feature works well across a range of skin tones. This is especially important for computer vision, a type of AI that allows computers to see and understand images. When not built and tested intentionally to include a broad range of skin tones, computer vision systems have been found to not perform as well for people with darker skin.

    The MST Scale will help us and the tech industry at large build more representative datasets so we can train and evaluate AI models for fairness, resulting in features and products that work better for everyone — of all skin tones. For example, we use the scale to evaluate and improve the models that detect faces in images.
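
    As a concrete illustration of how a 10-shade scale supports this kind of evaluation, here is a minimal sketch that buckets a sampled skin-tone color to its nearest scale point, so model performance can then be compared per bucket. The hex values and the simple RGB distance are illustrative placeholders — Google publishes the official swatch values, and a production pipeline would compare colors in a perceptual space such as CIELAB rather than raw RGB.

        # Sketch: bucketing a sampled color to the nearest point on a
        # 10-shade scale. Hex values are illustrative placeholders, not
        # the officially released Monk Skin Tone swatches.
        SCALE = [
            "#f6ede4", "#f3e7db", "#f7ead0", "#eadaba", "#d7bd96",
            "#a07e56", "#825c43", "#604134", "#3a312a", "#292420",
        ]

        def hex_to_rgb(h):
            h = h.lstrip("#")
            return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

        def nearest_shade(sample_hex):
            """Return the 1-based index of the closest scale point (RGB distance)."""
            sample = hex_to_rgb(sample_hex)
            distances = [
                sum((a - b) ** 2 for a, b in zip(sample, hex_to_rgb(shade)))
                for shade in SCALE
            ]
            return distances.index(min(distances)) + 1

        print(nearest_shade("#8d5524"))  # falls into a mid-to-deep bucket (7 here)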

    Here are other ways you’ll see this show up in Google products.

    Improving skin tone representation in Search

    Every day, millions of people search the web expecting to find images that reflect their specific needs. That’s why we’re also introducing new features using the MST Scale to make it easier for people of all backgrounds to find more relevant and helpful results.

    For example, now when you search for makeup-related queries in Google Images, you'll see an option to further refine your results by skin tone. So if you’re looking for “everyday eyeshadow” or “bridal makeup looks,” you’ll more easily find results that work better for your needs.

    Animated GIF showing a Google Images search for “bridal makeup looks.” The results include an option to filter by skin tone; the cursor selects a darker skin tone, which adjusts to results that are more relevant to this choice.

    Seeing yourself represented in results can be key to finding information that's truly relevant and useful, which is why we’re also rolling out improvements to show a greater range of skin tones in image results for broad searches about people, or ones where people show up in the results. In the future, we’ll incorporate the MST Scale to better detect and rank images to include a broader range of results, so everyone can find what they're looking for.

    Creating a more representative Search experience isn’t something we can do alone, though. How content is labeled online is a key factor in how our systems surface relevant results. In the coming months, we'll also be developing a standardized way to label web content. Creators, brands and publishers will be able to use this new inclusive schema to label their content with attributes like skin tone, hair color and hair texture. This will make it possible for content creators or online businesses to label their imagery in a way that search engines and other platforms can easily understand.

    A photograph of a Black person looking into the camera. Tags hover over various areas of the photo; one over their skin says “Skin tone” with a circle matching their skin tone. Two additional tags over their hair read “Hair color” and “Hair texture.”

    Improving skin tone representation in Google Photos

    We’ll also be using the MST Scale to improve Google Photos. Last year, we introduced an improvement to our auto enhance feature in partnership with professional image makers. Now we’re launching a new set of Real Tone filters that are designed to work well across skin tones and evaluated using the MST Scale. We worked with a diverse range of renowned image makers, like Kennedi Carter and Joshua Kissi, who are celebrated for beautiful and accurate depictions of their subjects, to evaluate, test and build these filters. These new Real Tone filters allow you to choose from a wider assortment of looks and find one that reflects your style. Real Tone filters will be rolling out on Google Photos across Android, iOS and Web in the coming weeks.

    Animated video showing before and after photos of images with the Real Tone Filter.

    What’s next?

    We’re openly releasing the Monk Skin Tone Scale so that others can use it in their own products and learn from this work — and so that we can partner with and learn from them. We want to get feedback, drive more interdisciplinary research, and make progress together. We encourage you to share your thoughts here. We’re continuing to collaborate with Dr. Monk to evaluate the MST Scale across different regions and product applications, and we’ll iterate and improve on it to make sure the scale works for people and use cases all over the world. And we’ll continue our efforts to make Google’s products work even better for every user.

    The best part of working on this project is that it isn’t just ours — while we’re committed to making Google products better and more inclusive, we’re also excited about all the possibilities that exist as we work together to build for everyone across the web.

  • Google Translate learns 24 new languages Wed, 11 May 2022 17:16:00 +0000

    For years, Google Translate has helped break down language barriers and connect communities all over the world. And we want to make this possible for even more people — especially those whose languages aren’t represented in most technology. So today we’ve added 24 languages to Translate, which now supports a total of 133 languages used around the globe.

    Over 300 million people speak these newly added languages — like Mizo, used by around 800,000 people in the far northeast of India, and Lingala, used by over 45 million people across Central Africa. As part of this update, Indigenous languages of the Americas (Quechua, Guarani and Aymara) and an English-based creole (Sierra Leonean Krio) have also been added to Translate for the first time.

    The Google Translate bar translates the phrase "Our mission: to enable everyone, everywhere to understand the world and express themselves across languages" into different languages.

    Translate's mission translated into some of our newly added languages

    Here’s a complete list of the new languages now available in Google Translate:

    • Assamese, used by about 25 million people in Northeast India
    • Aymara, used by about two million people in Bolivia, Chile and Peru
    • Bambara, used by about 14 million people in Mali
    • Bhojpuri, used by about 50 million people in northern India, Nepal and Fiji
    • Dhivehi, used by about 300,000 people in the Maldives
    • Dogri, used by about three million people in northern India
    • Ewe, used by about seven million people in Ghana and Togo
    • Guarani, used by about seven million people in Paraguay, Bolivia, Argentina and Brazil
    • Ilocano, used by about 10 million people in northern Philippines
    • Konkani, used by about two million people in Central India
    • Krio, used by about four million people in Sierra Leone
    • Kurdish (Sorani), used by about 15 million people in Iraq and Iran
    • Lingala, used by about 45 million people in the Democratic Republic of the Congo, Republic of the Congo, Central African Republic, Angola and the Republic of South Sudan
    • Luganda, used by about 20 million people in Uganda and Rwanda
    • Maithili, used by about 34 million people in northern India
    • Meiteilon (Manipuri), used by about two million people in Northeast India
    • Mizo, used by about 830,000 people in Northeast India
    • Oromo, used by about 37 million people in Ethiopia and Kenya
    • Quechua, used by about 10 million people in Peru, Bolivia, Ecuador and surrounding countries
    • Sanskrit, used by about 20,000 people in India
    • Sepedi, used by about 14 million people in South Africa
    • Tigrinya, used by about eight million people in Eritrea and Ethiopia
    • Tsonga, used by about seven million people in Eswatini, Mozambique, South Africa and Zimbabwe
    • Twi, used by about 11 million people in Ghana

    This is also a technical milestone for Google Translate. These are the first languages we’ve added using Zero-Shot Machine Translation, where a machine learning model sees only monolingual text — meaning it learns to translate into another language without ever seeing a direct translation example. While this technology is impressive, it isn't perfect. And we’ll keep improving these models to deliver the same experience you’re used to with a Spanish or German translation, for example. If you want to dig into the technical details, check out our Google AI blog post and research paper.

    We’re grateful to the many native speakers, professors and linguists who worked with us on this latest update and kept us inspired with their passion and enthusiasm. If you want to help us support your language in a future update, contribute evaluations or translations through Translate Contribute.

  • Google I/O 2022: Advancing knowledge and computing Wed, 11 May 2022 17:00:00 +0000

    [TL;DR]

    Nearly 24 years ago, Google started with two graduate students, one product, and a big mission: to organize the world’s information and make it universally accessible and useful. In the decades since, we’ve been developing our technology to deliver on that mission.

    The progress we've made is because of our years of investment in advanced technologies, from AI to the technical infrastructure that powers it all. And once a year — on my favorite day of the year :) — we share an update on how it’s going at Google I/O.

    Today, I talked about how we’re advancing two fundamental aspects of our mission — knowledge and computing — to create products that are built to help. It’s exciting to build these products; it’s even more exciting to see what people do with them.

    Thank you to everyone who helps us do this work, and most especially our Googlers. We are grateful for the opportunity.

    - Sundar


    Editor’s note: Below is an edited transcript of Sundar Pichai's keynote address during the opening of today's Google I/O Developers Conference.

    Hi, everyone, and welcome. Actually, let’s make that welcome back! It’s great to return to Shoreline Amphitheatre after three years away. To the thousands of developers, partners and Googlers here with us, it’s great to see all of you. And to the millions more joining us around the world — we’re so happy you’re here, too.

    Last year, we shared how new breakthroughs in some of the most technically challenging areas of computer science are making Google products more helpful in the moments that matter. All this work is in service of our timeless mission: to organize the world's information and make it universally accessible and useful.

    I'm excited to show you how we’re driving that mission forward in two key ways: by deepening our understanding of information so that we can turn it into knowledge; and advancing the state of computing, so that knowledge is easier to access, no matter who or where you are.

    Today, you'll see how progress on these two parts of our mission ensures Google products are built to help. I’ll start with a few quick examples. Throughout the pandemic, Google has focused on delivering accurate information to help people stay healthy. Over the last year, people used Google Search and Maps to find where they could get a COVID vaccine nearly two billion times.

    A visualization of Google’s flood forecasting system, with three 3D maps stacked on top of one another, showing landscapes and weather patterns in green and brown colors. The maps are floating against a gray background.

    Google’s flood forecasting technology sent flood alerts to 23 million people in India and Bangladesh last year.

    We’ve also expanded our flood forecasting technology to help people stay safe in the face of natural disasters. During last year’s monsoon season, our flood alerts notified more than 23 million people in India and Bangladesh. And we estimate this supported the timely evacuation of hundreds of thousands of people.

    In Ukraine, we worked with the government to rapidly deploy air raid alerts. To date, we’ve delivered hundreds of millions of alerts to help people get to safety. In March I was in Poland, where millions of Ukrainians have sought refuge. Warsaw’s population has increased by nearly 20% as families host refugees in their homes, and schools welcome thousands of new students. Nearly every Google employee I spoke with there was hosting someone.

    Adding 24 more languages to Google Translate

    In countries around the world, Google Translate has been a crucial tool for newcomers and residents trying to communicate with one another. We’re proud of how it’s helping Ukrainians find a bit of hope and connection until they are able to return home again.

    Two boxes, one showing a question in English — “What’s the weather like today?” — the other showing its translation in Quechua. There is a microphone symbol below the English question and a loudspeaker symbol below the Quechua answer.

    With machine learning advances, we're able to add languages like Quechua to Google Translate.

    Real-time translation is a testament to how knowledge and computing come together to make people's lives better. More people are using Google Translate than ever before, but we still have work to do to make it universally accessible. There’s a long tail of languages that are underrepresented on the web today, and translating them is a hard technical problem. That’s because translation models are usually trained with bilingual text — for example, the same phrase in both English and Spanish. However, there's not enough publicly available bilingual text for every language.

    So with advances in machine learning, we’ve developed a monolingual approach where the model learns to translate a new language without ever seeing a direct translation of it. By collaborating with native speakers and institutions, we found these translations were of sufficient quality to be useful, and we'll continue to improve them.

    A list of the 24 new languages Google Translate now has available.

    We’re adding 24 new languages to Google Translate.

    Today, I’m excited to announce that we’re adding 24 new languages to Google Translate, including the first indigenous languages of the Americas. Together, these languages are spoken by more than 300 million people. Breakthroughs like this are powering a radical shift in how we access knowledge and use computers.

    Taking Google Maps to the next level

    So much of what’s knowable about our world goes beyond language — it’s in the physical and geospatial information all around us. For more than 15 years, Google Maps has worked to create rich and useful representations of this information to help us navigate. Advances in AI are taking this work to the next level, whether it’s expanding our coverage to remote areas, or reimagining how to explore the world in more intuitive ways.

    An overhead image of a map of a dense urban area, showing gray roads cutting through clusters of buildings outlined in blue.

    Advances in AI are helping to map remote and rural areas.

    Around the world, we’ve mapped around 1.6 billion buildings and over 60 million kilometers of roads to date. Some remote and rural areas have previously been difficult to map, due to scarcity of high-quality imagery and distinct building types and terrain. To address this, we’re using computer vision and neural networks to detect buildings at scale from satellite images. As a result, we have increased the number of buildings on Google Maps in Africa by 5X since July 2020, from 60 million to nearly 300 million.

    We’ve also doubled the number of buildings mapped in India and Indonesia this year. Globally, over 20% of the buildings on Google Maps have been detected using these new techniques. We’ve gone a step further, and made the dataset of buildings in Africa publicly available. International organizations like the United Nations and the World Bank are already using it to better understand population density, and to provide support and emergency assistance.

    Immersive view in Google Maps fuses together aerial and street level images.

    We’re also bringing new capabilities into Maps. Using advances in 3D mapping and machine learning, we’re fusing billions of aerial and street level images to create a new, high-fidelity representation of a place. These breakthrough technologies are coming together to power a new experience in Maps called immersive view: it allows you to explore a place like never before.

    Let’s go to London and take a look. Say you’re planning to visit Westminster with your family. You can get into this immersive view straight from Maps on your phone, and you can pan around the sights… here’s Westminster Abbey. If you’re thinking of heading to Big Ben, you can check if there's traffic, how busy it is, and even see the weather forecast. And if you’re looking to grab a bite during your visit, you can check out restaurants nearby and get a glimpse inside.

    What’s amazing is that this isn’t a drone flying through the restaurant — we use neural rendering to create the experience from images alone. And Google Cloud Immersive Stream allows this experience to run on just about any smartphone. This feature will start rolling out in Google Maps for select cities globally later this year.

    Another big improvement to Maps is eco-friendly routing. Launched last year, it shows you the most fuel-efficient route, giving you the choice to save money on gas and reduce carbon emissions. Eco-friendly routes have already rolled out in the U.S. and Canada — and people have used them to travel approximately 86 billion miles, helping save an estimated half million metric tons of carbon emissions, the equivalent of taking 100,000 cars off the road.
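
    (As a rough check on that equivalence: the EPA’s commonly cited figure is about 4.6 metric tons of CO2 per typical passenger vehicle per year, and 500,000 ÷ 4.6 ≈ 109,000 — in line with the 100,000-car estimate.)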

    Still image of eco-friendly routing on Google Maps — a 53-minute driving route in Berlin is pictured, with text below the map showing it will add three minutes but save 18% more fuel.

    Eco-friendly routes will expand to Europe later this year.

    I’m happy to share that we’re expanding this feature to more places, including Europe later this year. In this Berlin example, you could reduce your fuel consumption by 18% taking a route that’s just three minutes slower. These small decisions have a big impact at scale. With the expansion into Europe and beyond, we estimate carbon emission savings will double by the end of the year.

    And we’ve added a similar feature to Google Flights. When you search for flights between two cities, we also show you carbon emission estimates alongside other information like price and schedule, making it easy to choose a greener option. These eco-friendly features in Maps and Flights are part of our goal to empower 1 billion people to make more sustainable choices through our products, and we’re excited about the progress here.

    New YouTube features to help people easily access video content

    Beyond Maps, video is becoming an even more fundamental part of how we share information, communicate, and learn. Often when you come to YouTube, you are looking for a specific moment in a video and we want to help you get there faster.

    Last year we launched auto-generated chapters to make it easier to jump to the part you’re most interested in.

    This is also great for creators because it saves them time making chapters. We’re now applying multimodal technology from DeepMind. It simultaneously uses text, audio and video to auto-generate chapters with greater accuracy and speed. With this, we now have a goal to 10X the number of videos with auto-generated chapters, from eight million today, to 80 million over the next year.

    Often the fastest way to get a sense of a video’s content is to read its transcript, so we’re also using speech recognition models to transcribe videos. Video transcripts are now available to all Android and iOS users.

    Animation showing a video being automatically translated. Then text reads "Now available in sixteen languages."

    Auto-translated captions on YouTube.

    Next up, we’re bringing auto-translated captions on YouTube to mobile, which means viewers can now auto-translate video captions in 16 languages, and creators can grow their global audiences. We’ll also be expanding auto-translated captions to Ukrainian YouTube content next month, part of our larger effort to increase access to accurate information about the war.

    Helping people be more efficient with Google Workspace

    Just as we’re using AI to improve features in YouTube, we’re building it into our Workspace products to help people be more efficient. Whether you work for a small business or a large institution, chances are you spend a lot of time reading documents. Maybe you’ve felt that wave of panic when you realize you have a 25-page document to read ahead of a meeting that starts in five minutes.

    At Google, whenever I get a long document or email, I look for a TL;DR at the top — TL;DR is short for “Too Long, Didn’t Read.” And it got us thinking, wouldn’t life be better if more things had a TL;DR?

    That’s why we’ve introduced automated summarization for Google Docs. Using one of our machine learning models for text summarization, Google Docs will automatically parse the words and pull out the main points.

    This marks a big leap forward for natural language processing. Summarization requires understanding of long passages, information compression and language generation, which used to be outside of the capabilities of even the best machine learning models.
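
    Docs uses an abstractive neural model, which can’t be reproduced in a few lines — but for contrast, here is a toy extractive sketch that scores sentences by word frequency and keeps the top-scoring ones. It illustrates the older, simpler end of the spectrum described above, not the approach Docs actually uses.

        # Toy extractive summarizer: score sentences by average word
        # frequency and keep the top few. This is NOT the abstractive
        # neural approach Google Docs uses — just a baseline for contrast.
        import re
        from collections import Counter

        def extractive_summary(text, max_sentences=2):
            sentences = re.split(r"(?<=[.!?])\s+", text.strip())
            freq = Counter(re.findall(r"[a-z']+", text.lower()))

            def score(sentence):
                toks = re.findall(r"[a-z']+", sentence.lower())
                return sum(freq[t] for t in toks) / max(len(toks), 1)

            top = sorted(sentences, key=score, reverse=True)[:max_sentences]
            # Keep the chosen sentences in their original order.
            return " ".join(s for s in sentences if s in top)

        doc = ("Summarization requires understanding long passages. "
               "It also requires compressing information. "
               "Finally, it requires generating fluent language.")
        print(extractive_summary(doc, max_sentences=1))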

    And Docs is only the beginning. We’re launching summarization for other products in Workspace. It will come to Google Chat in the next few months, providing a helpful digest of chat conversations, so you can jump right into a group chat or look back at the key highlights.

    Animation showing summary in Google Chat

    We’re bringing summarization to Google Chat in the coming months.

    And we’re working to bring transcription and summarization to Google Meet as well so you can catch up on some important meetings you missed.

    Visual improvements on Google Meet

    Of course there are many moments where you really want to be in a virtual room with someone. And that’s why we continue to improve audio and video quality, inspired by Project Starline. We introduced Project Starline at I/O last year. And we’ve been testing it across Google offices to get feedback and improve the technology for the future. And in the process, we’ve learned some things that we can apply right now to Google Meet.

    Project Starline also inspired machine learning-powered image processing that automatically improves your image quality in Google Meet. And it works on all types of devices, so you look your best wherever you are.

    An animation of a man looking directly at the camera then waving and smiling. A white line sweeps across the screen, adjusting the image quality to make it brighter and clearer.

    Machine learning-powered image processing automatically improves image quality in Google Meet.

    We’re also bringing studio quality virtual lighting to Meet. You can adjust the light position and brightness, so you’ll still be visible in a dark room or sitting in front of a window. We’re testing this feature to ensure everyone looks like their true selves, continuing the work we’ve done with Real Tone on Pixel phones and the Monk Scale.

    These are just some of the ways AI is improving our products: making them more helpful, more accessible, and delivering innovative new features for everyone.

    Gif shows a phone camera pointed towards a rack of shelves, generating helpful information about food items. Text on the screen shows the words ‘dark’, ‘nut-free’ and ‘highly-rated’.

    Today at I/O, Prabhakar Raghavan shared how we’re helping people find helpful information in more intuitive ways on Search.

    Making knowledge accessible through computing

    We’ve talked about how we’re advancing access to knowledge as part of our mission: from better language translation to improved Search experiences across images and video, to richer explorations of the world using Maps.

    Now we’re going to focus on how we make that knowledge even more accessible through computing. The journey we’ve been on with computing is an exciting one. Every shift, from desktop to the web to mobile to wearables and ambient computing has made knowledge more useful in our daily lives.

    As helpful as our devices are, we’ve had to work pretty hard to adapt to them. I’ve always thought computers should be adapting to people, not the other way around. We continue to push ourselves to make progress here.

    Here’s how we’re making computing more natural and intuitive with the Google Assistant.

    Introducing LaMDA 2 and AI Test Kitchen

    Animation shows demos of how LaMDA can converse on any topic and how AI Test Kitchen can help create lists.

    A demo of LaMDA, our generative language model for dialogue applications, and the AI Test Kitchen.

    We're continually working to advance our conversational capabilities. Conversation and natural language processing are powerful ways to make computers more accessible to everyone. And large language models are key to this.

    Last year, we introduced LaMDA, our generative language model for dialogue applications that can converse on any topic. Today, we are excited to announce LaMDA 2, our most advanced conversational AI yet.

    We are at the beginning of a journey to make models like these useful to people, and we feel a deep responsibility to get it right. To make progress, we need people to experience the technology and provide feedback. We opened LaMDA up to thousands of Googlers, who enjoyed testing it and seeing its capabilities. This yielded significant quality improvements, and led to a reduction in inaccurate or offensive responses.

    That’s why we’ve made AI Test Kitchen. It’s a new way to explore AI features with a broader audience. Inside the AI Test Kitchen, there are a few different experiences. Each is meant to give you a sense of what it might be like to have LaMDA in your hands and use it for things you care about.

    The first is called “Imagine it.” This demo tests if the model can take a creative idea you give it, and generate imaginative and relevant descriptions. These are not products, they are quick sketches that allow us to explore what LaMDA can do with you. The user interfaces are very simple.

    Say you’re writing a story and need some inspirational ideas. Maybe one of your characters is exploring the deep ocean. You can ask what that might feel like. Here LaMDA describes a scene in the Mariana Trench. It even generates follow-up questions on the fly. You can ask LaMDA to imagine what kinds of creatures might live there. Remember, we didn’t hand-program the model for specific topics like submarines or bioluminescence. It synthesized these concepts from its training data. That’s why you can ask about almost any topic: Saturn’s rings or even being on a planet made of ice cream.

    Staying on topic is a challenge for language models. Say you’re building a learning experience — you want it to be open-ended enough to allow people to explore where curiosity takes them, but stay safely on topic. Our second demo tests how LaMDA does with that.

    In this demo, we’ve primed the model to focus on the topic of dogs. It starts by generating a question to spark conversation, “Have you ever wondered why dogs love to play fetch so much?” And if you ask a follow-up question, you get an answer with some relevant details: it’s interesting, it thinks it might have something to do with the sense of smell and treasure hunting.

    You can take the conversation anywhere you want. Maybe you’re curious about how smell works and you want to dive deeper. You’ll get a unique response for that too. No matter what you ask, it will try to keep the conversation on the topic of dogs. If I start asking about cricket, which I probably would, the model brings the topic back to dogs in a fun way.

    This challenge of staying on-topic is a tricky one, and it’s an important area of research for building useful applications with language models.

    These experiences show the potential of language models to one day help us with things like planning, learning about the world, and more.

    Of course, there are significant challenges to solve before these models can truly be useful. While we have improved safety, the model might still generate inaccurate, inappropriate, or offensive responses. That’s why we are inviting feedback in the app, so people can help report problems.

    We will be doing all of this work in accordance with our AI Principles. Our process will be iterative, opening up access over the coming months, and carefully assessing feedback with a broad range of stakeholders — from AI researchers and social scientists to human rights experts. We’ll incorporate this feedback into future versions of LaMDA, and share our findings as we go.

    Over time, we intend to continue adding other emerging areas of AI into AI Test Kitchen. You can learn more at: g.co/AITestKitchen.

    Advancing AI language models

    LaMDA 2 has incredible conversational capabilities. To explore other aspects of natural language processing and AI, we recently announced a new model. It’s called Pathways Language Model, or PaLM for short. It’s our largest model to date, with 540 billion parameters.

    PaLM demonstrates breakthrough performance on many natural language processing tasks, such as generating code from text, answering a math word problem, or even explaining a joke.

    It achieves this through greater scale. And when we combine that scale with a new technique called chain-of-thought prompting, the results are promising. Chain-of-thought prompting allows us to describe multi-step problems as a series of intermediate steps.

    Let’s take an example of a math word problem that requires reasoning. Normally, you prompt a model with an example question and answer, and then start asking it questions. In this case: How many hours are in the month of May? As you can see, the model didn’t quite get it right.

    In chain-of-thought prompting, we give the model a question-answer pair, but this time, an explanation of how the answer was derived. Kind of like when your teacher gives you a step-by-step example to help you understand how to solve a problem. Now, if we ask the model again — how many hours are in the month of May — or other related questions, it actually answers correctly and even shows its work.

    There are two boxes below a heading saying ‘chain-of-thought prompting’. A box headed ‘input’ guides the model through answering a question about how many tennis balls a person called Roger has. The output box shows the model correctly reasoning through and answering a separate question (‘how many hours are in the month of May?’)

    Chain-of-thought prompting leads to better reasoning and more accurate answers.

    Chain-of-thought prompting increases accuracy by a large margin. This leads to state-of-the-art performance across several reasoning benchmarks, including math word problems. And we can do it all without ever changing how the model is trained.
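
    To make the two prompt formats above concrete, here is a minimal sketch using a made-up exemplar pair. The prompts are just strings handed to whatever large language model you have access to; no Google-specific API is assumed.

        # Sketch: standard prompting vs. chain-of-thought prompting.
        # These are plain prompt strings for any large language model.

        standard_prompt = (
            "Q: How many days are in two weeks?\n"
            "A: 14\n"
            "Q: How many hours are in the month of May?\n"
            "A:"
        )

        chain_of_thought_prompt = (
            "Q: How many days are in two weeks?\n"
            "A: A week has 7 days, so two weeks is 2 x 7 = 14 days. The answer is 14.\n"
            "Q: How many hours are in the month of May?\n"
            "A:"
        )

        # With the chain-of-thought exemplar, a capable model tends to answer
        # with its reasoning — "May has 31 days, and 31 x 24 = 744 hours. The
        # answer is 744." — i.e. it shows its work, as described above.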

    PaLM is highly capable and can do so much more. For example, you might be someone who speaks a language that’s not well represented on the web today — which makes it hard to find information, and all the more frustrating because the answer you’re looking for is probably out there. PaLM offers a new approach that holds enormous promise for making knowledge more accessible for everyone.

    Let me show you an example in which we can help answer questions in a language like Bengali — spoken by a quarter of a billion people. Just like before, we prompt the model with two examples of questions in Bengali with both Bengali and English answers.

    That’s it, now we can start asking questions in Bengali: “What is the national song of Bangladesh?” The answer, by the way, is “Amar Sonar Bangla” — and PaLM got it right, too. This is not that surprising because you would expect that content to exist in Bengali.

    You can also try something that is less likely to have related information in Bengali such as: “What are popular pizza toppings in New York City?” The model again answers correctly in Bengali. Though it probably just stirred up a debate amongst New Yorkers about how “correct” that answer really is.

    What’s so impressive is that PaLM has never seen parallel sentences between Bengali and English. Nor was it ever explicitly taught to answer questions or translate at all! The model brought all of its capabilities together to answer questions correctly in Bengali. And we can extend the techniques to more languages and other complex tasks.
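
    The pattern at work here is ordinary few-shot prompting: a couple of worked question-answer pairs, then the new question. A minimal sketch — with English stand-ins for the Bengali exemplars, and no model call, since that part depends on whichever API you use:

        # Sketch: assembling a few-shot prompt for cross-lingual Q&A.
        # The exemplars are English stand-ins for the Bengali examples
        # described above.
        def build_few_shot_prompt(examples, question):
            """Concatenate worked Q/A exemplars, then append the new question."""
            parts = [f"Q: {q}\nA: {a}" for q, a in examples]
            parts.append(f"Q: {question}\nA:")
            return "\n\n".join(parts)

        examples = [
            ("What is the capital of Bangladesh?", "Dhaka"),
            ("Which river flows through Dhaka?", "The Buriganga"),
        ]
        print(build_few_shot_prompt(
            examples, "What is the national song of Bangladesh?"))
        # The model completes the final "A:" — PaLM answered "Amar Sonar Bangla".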

    We're so optimistic about the potential for language models. One day, we hope we can answer questions on more topics in any language you speak, making knowledge even more accessible, in Search and across all of Google.

    Introducing the world’s largest, publicly available machine learning hub

    The advances we’ve shared today are possible only because of our continued innovation in our infrastructure. Recently we announced plans to invest $9.5 billion in data centers and offices across the U.S.

    One of our state-of-the-art data centers is in Mayes County, Oklahoma. I’m excited to announce that we’re launching the world’s largest publicly available machine learning hub there for our Google Cloud customers.

    Still image of a data center with Oklahoma map pin on bottom left corner.

    One of our state-of-the-art data centers in Mayes County, Oklahoma.

    This machine learning hub has eight Cloud TPU v4 pods, custom-built on the same networking infrastructure that powers Google’s largest neural models. They provide nearly nine exaflops of computing power in aggregate — bringing our customers an unprecedented ability to run complex models and workloads. We hope this will fuel innovation across many fields, from medicine to logistics, sustainability and more.
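
    (For a rough sense of that figure: Google has publicly described a Cloud TPU v4 pod as 4,096 chips with a peak of about 275 bf16 teraflops each, so one pod is roughly 4,096 × 275 TFLOPS ≈ 1.1 exaflops, and eight pods together come to about 9 exaflops — matching the aggregate number above.)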

    And speaking of sustainability, this machine learning hub is already operating at 90% carbon-free energy. This is helping us make progress on our goal to become the first major company to operate all of our data centers and campuses globally on 24/7 carbon-free energy by 2030.

    Even as we invest in our data centers, we are working to innovate on our mobile platforms so more processing can happen locally on device. Google Tensor, our custom system on a chip, was an important step in this direction. It’s already running on Pixel 6 and Pixel 6 Pro, and it brings our AI capabilities — including the best speech recognition we’ve ever deployed — right to your phone. It’s also a big step forward in making those devices more secure. Combined with Android’s Private Compute Core, it can run data-powered features directly on device so that it’s private to you.

    People turn to our products every day for help in moments big and small. Core to making this possible is protecting your private information each step of the way. Even as technology grows increasingly complex, we keep more people safe online than anyone else in the world, with products that are secure by default, private by design and that put you in control.

    We also spent time today sharing updates to platforms like Android. They’re delivering access, connectivity, and information to billions of people through their smartphones and other connected devices like TVs, cars and watches.

    And we shared our new Pixel Portfolio, including the Pixel 6a, Pixel Buds Pro, Google Pixel Watch, Pixel 7, and Pixel tablet all built with ambient computing in mind. We’re excited to share a family of devices that work better together — for you.

    The next frontier of computing: augmented reality

    Today we talked about all the technologies that are changing how we use computers and access knowledge. We see devices working seamlessly together, exactly when and where you need them and with conversational interfaces that make it easier to get things done.

    Looking ahead, there's a new frontier of computing, which has the potential to extend all of this even further, and that is augmented reality. At Google, we have been heavily invested in this area. We’ve been building augmented reality into many Google products, from Google Lens to multisearch, scene exploration, and Live and immersive views in Maps.

    These AR capabilities are already useful on phones and the magic will really come alive when you can use them in the real world without the technology getting in the way.

    That potential is what gets us most excited about AR: the ability to spend time focusing on what matters in the real world, in our real lives. Because the real world is pretty amazing!

    It’s important we design in a way that is built for the real world — and doesn’t take you away from it. And AR gives us new ways to accomplish this.

    Let’s take language as an example. Language is just so fundamental to connecting with one another. And yet, understanding someone who speaks a different language, or trying to follow a conversation if you are deaf or hard of hearing can be a real challenge. Let's see what happens when we take our advancements in translation and transcription and deliver them in your line of sight in one of the early prototypes we’ve been testing.

    You can see it in their faces: the joy that comes with speaking naturally to someone. That moment of connection. To understand and be understood. That’s what our focus on knowledge and computing is all about. And it’s what we strive for every day, with products that are built to help.

    Each year we get a little closer to delivering on our timeless mission. And we still have so much further to go. At Google, we genuinely feel a sense of excitement about that. And we are optimistic that the breakthroughs you just saw will help us get there. Thank you to all of the developers, partners and customers who joined us today. We look forward to building the future with all of you.

  • The Google Store is coming to Brooklyn Wed, 11 May 2022 16:55:00 +0000

    A year ago, we opened the doors to Google’s first-ever physical retail store in New York City. Since opening this flagship store in the iconic Chelsea neighborhood, we’ve heard how useful it is to try out our products in person — like giving the Pixel 6 Pro a spin or listening to a YouTube playlist on Nest Audio. Now, we’re bringing this experience to even more New Yorkers.

    Today, we’re announcing our plans to open our second physical store in Williamsburg, Brooklyn. The Google Store Williamsburg will be the first of our “neighborhood stores,” offering similar hands-on experiences with our products and services as our flagship store, but in a more intimate setting that celebrates the unique neighborhood we’re in. We’ll start welcoming customers to our new location at 134 N 6th Street on June 16.

    Inside our first neighborhood store

    As soon as you walk through the door at the Google Store Williamsburg, you’ll find an installation by Brooklyn-based artist Olalekan Jeyifous, whose work examines the relationships between architecture, community and the environment. We’ll also host local events to celebrate Brooklyn, like guided walks around the neighborhood where you can try out Pixel photography features.

    Underneath the installation, you’ll find our Here to Help desk. Our Chelsea flagship store visitors have told us they appreciate getting support, like Pixel phone repairs, directly from Google experts — so we’re bringing this to the Google Store Williamsburg, too.

    You’ll also get the chance to picture everyday life with our products through interactive displays that show how our hardware and services work together. For example, you can explore Google Fi phone plans, discover which Pixel color best suits your personality or learn what goes into making our phone cases more sustainable. Meanwhile, kick back and relax on our couches to imagine what it would be like to use Google products at home — an area that will also serve as a space for local events and workshops.

    And just like at our flagship store, you’ll be able to easily find a product at the Grab & Go wall or pick up a pre-order that you placed with the Google Store online. No matter what your reason is for stopping by, we’ll help you find what you need.

    Feedback plays a big role in improving our stores, and we’ll keep listening to make sure you get the most helpful shopping experience — from Manhattan to Brooklyn. We look forward to welcoming you to the Google Store Williamsburg in June!

  • What’s beta than Android 13? Wed, 11 May 2022 16:14:00 +0000

    Every year and with every release, we make Android better based on your feedback. With Android 13, we’re continuing to improve the quality and performance of the platform while building on many areas that matter most to you, like privacy and security, personalization and large-screen devices.

    Today, we’re sharing more about Android 13 and releasing the second beta across many Android phones, tablets and foldable devices.

    A foundation of privacy and security

    In Android 13, we’re giving you more control over what personal information you share, with more detailed control over which files your apps can access. Instead of a single “Files and media” permission, there are two new categories you can grant access to: “Photos & videos” and “Music & audio.” For even more specificity, a new photo picker lets you select the exact photos or videos you want to grant access to, without needing to share your entire media library with an app.

    We’re also helping you be more deliberate about how you engage with apps. While app notifications often provide helpful and timely reminders, you should have more control over which apps you want to receive notifications from. In Android 13, apps must get your permission before sending you notifications. In addition, we’re reducing the number of apps that require your location. For example, you will no longer need to grant location to apps to enable Wi-Fi scanning.

    Android 13 goes further to help you stay ahead of risks, with timely recommendations and options to enhance your privacy. You already receive an alert when an app accesses your clipboard. Now, Android will go further and automatically delete your clipboard history after a short period so apps are preemptively blocked from seeing old copied information.

    Later this year, we’ll introduce a unified Security & Privacy settings page in Android 13 that brings all your device’s data privacy and security front and center. This will provide a clear, color-coded indicator of your safety status and offer guidance and steps you can take to boost your security.

    Within the Security & Privacy settings page, a color-coded indicator shows the device’s safety status. At the top of this user’s screen it reads “Looks good” with a green check mark beside it.

    Personalized experiences for you

    Last year, we introduced Material You to help your phone adapt to your style and preferences. With Android 13, we’re going further to customize your phone’s look and feel with pre-made color variants. Once a color scheme has been selected, you’ll see beautiful color variants applied across the entire OS to accentuate your wallpaper and style.

    Different color variants applied to the calculator app on four phones with a floral orange wallpaper.

    Android 13 also extends color theming of your app icons beyond Google apps. Starting with Pixel devices, you’ll be able to turn on the “Themed icons” toggle in your settings to have all supported apps also match your phone’s colors in a minimal, modern and consistent look.

    Apps on the home screen are all light orange and gray to match the orange floral wallpaper

    We’re also introducing a new media control that tailors its look based on the music that you’re listening to, featuring the album’s artwork.

    A phone’s lock screen with a media player in the center, showing colorful album artwork for the song that’s playing.

    Personalization in Android 13 extends beyond the design and aesthetic of the phone’s interface to other areas that are important and unique to you, like your language preferences. If you’re multilingual, you likely use different languages depending on the situation and may change how you communicate from one instance to the next. For example, you might enjoy social media in one language, but bank in another. Android 13 helps you use language as fluidly as you do in real life, so you can select a different language preference for each of your apps in Settings.

    While in Settings, a user selects ShareChat and sees a list of languages the app can run in, such as the system language, Hindi, Marathi, Bengali, Gujarati, Punjabi and more.

    Tablets just keep getting better

    Android 12L’s updates optimized the layout for bigger screen devices. Android 13 builds on this foundation by introducing better multitasking capabilities for tablets. With the updated taskbar, you can easily switch your single tablet view to a split screen. Just drag and drop any second app in your app library onto your screen and you’ll be able to do two or more things at once with ease.

    A tablet user drags and drops apps like Google Photos and Gmail into split screen from the new All Apps entry point in their taskbar.

    We’re also improving the experience for when you’re writing or drawing with a stylus pen. In Android 13, you can rest your hand comfortably on the screen without worrying about it being misidentified as a stylus pen, reducing any unintended actions.

    We know these changes don’t mean much if apps aren’t built for the larger screens. So over the next few weeks, we’ll be updating more than 20 Google apps to take full advantage of the extra space with added functionality. Many of the third-party apps you love — like TikTok, Facebook and Zoom — will be revamped to make your experiences on tablets even better.

    Try out Android 13 features, with more on the way

    Android 13 has much more in store, including features that shape modern standards for audio and video like HDR video, Spatial Audio and Bluetooth Low Energy Audio.

    You can find many of these features today in the second beta of Android 13. We have a great lineup of beta partners and we can’t wait for you to try it on your favorite device.

  • Search your world, any way and anywhere Wed, 11 May 2022 16:05:00 +0000

    People have always gathered information in a variety of ways — from talking to others, to observing the world around them, to, of course, searching online. Though typing words into a search box has become second nature for many of us, it’s far from the most natural way to express what we need. For example, if I’m walking down the street and see an interesting tree, I might point to it and ask a friend what species it is and if they know of any nearby nurseries that might sell seeds. If I were to express that question to a search engine just a few years ago… well, it would have taken a lot of queries.

    But we’ve been working hard to change that. We've already started on a journey to make searching more natural. Whether you're humming the tune that's been stuck in your head, or using Google Lens to search visually (which now happens more than 8 billion times per month!), there are more ways to search and explore information than ever before.

    Today, we're redefining Google Search yet again, combining our understanding of all types of information — text, voice, visual and more — so you can find helpful information about whatever you see, hear and experience, in whichever ways are most intuitive to you. We envision a future where you can search your whole world, any way and anywhere.

    Find local information with multisearch

    The recent launch of multisearch, one of our most significant updates to Search in several years, is a milestone on this path. In the Google app, you can search with images and text at the same time — similar to how you might point at something and ask a friend about it.

    Now we’re adding a way to find local information with multisearch, so you can uncover what you need from the millions of local businesses on Google. You’ll be able to use a picture or screenshot and add “near me” to see options for local restaurants or retailers that have the apparel, home goods and food you’re looking for.

    An animation of a phone showing a search. A photo is taken of Korean cuisine, then Search scans it for restaurants near the user that serve it.

    Later this year, you’ll be able to find local information with multisearch.

    For example, say you see a colorful dish online you’d like to try – but you don’t know what’s in it, or what it’s called. When you use multisearch to find it near you, Google scans millions of images and reviews posted on web pages, and from our community of Maps contributors, to find results about nearby spots that offer the dish so you can go enjoy it for yourself.

    Local information in multisearch will be available globally later this year in English, and will expand to more languages over time.

    Get a more complete picture with scene exploration

    Today, when you search visually with Google, we’re able to recognize objects captured in a single frame. But sometimes, you might want information about a whole scene in front of you.

    In the future, with an advancement called “scene exploration,” you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.

    In the future, “scene exploration” will help you uncover insights across multiple objects in a scene at the same time.

    Imagine you’re trying to pick out the perfect candy bar for your friend who's a bit of a chocolate connoisseur. You know they love dark chocolate but dislike nuts, and you want to get them something of quality. With scene exploration, you’ll be able to scan the entire shelf with your phone’s camera and see helpful insights overlaid in front of you. Scene exploration is a powerful breakthrough in our devices’ ability to understand the world the way we do – so you can easily find what you’re looking for – and we look forward to bringing it to multisearch in the future.

    These are some of the latest steps we’re taking to help you search any way and anywhere. But there’s more we’re doing, beyond Search. AI advancements are helping bridge the physical and digital worlds in Google Maps, and making it possible to interact with the Google Assistant more naturally and intuitively. To ensure information is truly useful for people from all communities, it’s also critical for people to see themselves represented in the results they find. Underpinning all these efforts is our commitment to helping you search safely, with new ways to control your online presence and information.

  • I/O 2022 Wed, 11 May 2022 16:00:00 +0000



Google Ads
Many books were created to help people understand how Google works, its corporate culture and how to use its services and products. The following books are available:

  • Ultimate Guide to Google Ads
  • The Ridiculously Simple Guide to Google Docs: A Practical Guide to Cloud-Based Word Processing
  • Mastering Google Adwords: Step-by-Step Instructions for Advertising Your Business (Including Google Analytics)
  • Google Classroom: Definitive Guide for Teachers to Learn Everything About Google Classroom and Its Teaching Apps. Tips and Tricks to Improve Lessons’ Quality.
  • 3 Months to No.1: The "No-Nonsense" SEO Playbook for Getting Your Website Found on Google
  • Google AdSense Made Easy: Monetize Your Website and Blogs Instantly With These Proven Google Adsense Techniques
  • Ultimate Guide to Google AdWords: How to Access 100 Million People in 10 Minutes (Ultimate Series)


Google Cloud Blog

  • New observability features for your Splunk Dataflow streaming pipelines Fri, 13 May 2022 16:00:00 -0000

    We’re thrilled to announce several new observability features for the Pub/Sub to Splunk Dataflow template to help operators keep tabs on their streaming pipeline performance. Splunk Enterprise and Splunk Cloud customers use the Splunk Dataflow template to reliably export Google Cloud logs for in-depth analytics for security, IT or business use cases. With newly added metrics and improved logging for the Splunk IO sink, it’s now easier to answer operational questions such as:

    • Is the Dataflow pipeline keeping up with the volume of logs generated?

    • What are the latency and throughput (events per second, or EPS) when writing to Splunk?

    • What is the response status breakdown of downstream Splunk HTTP Event Collector (HEC) and potential error messages?

    This critical visibility helps you derive your log export service-level indicators (SLIs) and monitor for any pipeline performance regressions. You can also more easily root-cause potential downstream failures between Dataflow and Splunk, such as Splunk HEC network connection or server issues, and fix the problem before it cascades.

    To help you quickly chart these new metrics, we’ve included them in the custom dashboard as part of the updated Terraform module for Splunk Dataflow. You can use those Terraform templates to deploy the entire infrastructure for log export to Splunk, or just the Monitoring dashboard alone.

    Log Export Ops Dashboard for Splunk Dataflow

    More metrics

    In your Dataflow Console, you may have noticed several new custom metrics (highlighted below) for jobs launched from template version 2022-03-21-00_RC01 (that is, gs://dataflow-templates/2022-03-21-00_RC01/Cloud_PubSub_to_Splunk) or later:

    [Screenshot: new custom metrics in the Dataflow Console job info panel]

    Pipeline instrumentation

    Before we dive into the new metrics, let’s take a step back and go over the Splunk Dataflow job steps. The following flowchart represents the different stages that comprise a Splunk Dataflow job along with corresponding custom metrics:

    [Flowchart: Splunk Dataflow job stages, with counter metrics labeled 1 through 10 and distribution metrics labeled A through C]

    In this pipeline, we utilize two types of Apache Beam custom metrics:

    • Counter metrics, labeled 1 through 10 above, used to count messages and requests (both successful and failed).
    • Distribution metrics, labeled A through C above, used to report on the distribution of request latency (both successful and failed) and batch size.

    Downstream request visibility

    Splunk Dataflow operators have relied on some of these pre-built custom metrics to monitor log messages’ progress through the different pipeline stages, particularly in the last stage, Write To Splunk, where the metrics outbound-successful-events (counter #6 above) and outbound-failed-events (counter #7 above) track the number of messages that were (or weren’t) successfully exported to Splunk. While operators had visibility into the outbound message success rate, they lacked visibility at the HEC request level. With the addition of counters #8-10 above, Splunk Dataflow operators can now monitor, from the Dataflow Console, not only the number of successful and failed HEC requests over time, but also the response status breakdown, to determine whether a request failed due to a client request issue (e.g. an invalid Splunk index or HEC token) or a transient network or Splunk issue (e.g. server busy or down). The new counters are:

    • http-valid-requests
    • http-invalid-requests
    • http-server-error-requests

    Splunk Dataflow operators can also now track average latency of downstream requests to Splunk HEC, as well as average request batch size, by using the new distribution metrics #A-C, that is:

    • successful_write_to_splunk_latency_ms
    • unsuccessful_write_to_splunk_latency_ms
    • write_to_splunk_batch

    Note that a Distribution metric in Beam is reported by Dataflow as four sub-metrics suffixed with _MAX, _MIN, _MEAN and _COUNT. That is why those 3 new distribution metrics translate into 12 new metrics in Cloud Monitoring, as you can see in the earlier job info screenshot from the Dataflow Console. Dataflow currently does not support creating a histogram to visualize the breakdown of these metrics’ values, so the _MEAN sub-metric is the only one useful for our purposes. As an all-time average value, _MEAN cannot be used to track changes over arbitrary time intervals (e.g. hourly), but it is useful for capturing a baseline, tracking a trend, or comparing different pipelines.
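
    These counters and distributions are also queryable outside the Dataflow Console via the Cloud Monitoring API. Below is a minimal sketch, assuming the usual convention that Dataflow reports user-defined counters under the dataflow.googleapis.com/job/user_counter metric type with a metric_name label; the project ID and one-hour window are placeholders to adapt to your environment.

      import time

      from google.cloud import monitoring_v3

      PROJECT_ID = "my-project"  # placeholder: your project ID

      client = monitoring_v3.MetricServiceClient()
      now = int(time.time())
      # Look at the last hour of data points.
      interval = monitoring_v3.TimeInterval(
          {"end_time": {"seconds": now}, "start_time": {"seconds": now - 3600}}
      )

      # Pull the pipeline's count of HEC requests that failed server-side.
      results = client.list_time_series(
          request={
              "name": f"projects/{PROJECT_ID}",
              "filter": (
                  'metric.type="dataflow.googleapis.com/job/user_counter" '
                  'AND metric.labels.metric_name="http-server-error-requests"'
              ),
              "interval": interval,
              "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
          }
      )
      for series in results:
          job = series.resource.labels.get("job_name", "unknown")
          latest = series.points[0].value.int64_value if series.points else 0
          print(f"{job}: http-server-error-requests = {latest}")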

    Dataflow custom metrics, including the aforementioned metrics reported by the Splunk Dataflow template, are a chargeable feature of Cloud Monitoring. For more information on metrics pricing, see Pricing for Cloud Monitoring.

    Improved logging

    Logging HEC errors

    To further root cause downstream issues, HEC request errors are now adequately logged, including both response status code and message:

    [Screenshot: a logged HEC request error showing the response status code and message]

    You can retrieve them directly in Worker Logs from Dataflow Console by setting log severity to Error.

    Alternatively, for those who prefer using Logs Explorer, you can use the following query.

      log_id("dataflow.googleapis.com/worker")
      resource.type="dataflow_step"
      resource.labels.step_id="WriteToSplunk/Write Splunk events"
      severity=ERROR
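
    If you’d rather pull these error entries programmatically, here is a minimal sketch using the google-cloud-logging client with the same filter (newlines in a Logging filter act as implicit ANDs); the project ID is a placeholder.

      from google.cloud import logging as cloud_logging

      HEC_ERROR_FILTER = """
      log_id("dataflow.googleapis.com/worker")
      resource.type="dataflow_step"
      resource.labels.step_id="WriteToSplunk/Write Splunk events"
      severity=ERROR
      """

      client = cloud_logging.Client(project="my-project")  # placeholder project
      # Newest entries first; print when and what failed.
      for entry in client.list_entries(
          filter_=HEC_ERROR_FILTER, order_by=cloud_logging.DESCENDING
      ):
          print(entry.timestamp, entry.payload)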

    Disabling batch logs

    By default, Splunk Dataflow workers log every HEC request as follows:

    [Screenshot: the two per-request “batch log” messages written by a Splunk Dataflow worker]

    Even though these requests often carry batched events, these ‘batch logs’ are chatty: they add two log messages for every HEC request. With the addition of the request-level counters (http-*-requests), the latency and batch size distributions, and the HEC error logging mentioned above, these batch logs are generally redundant. To control worker log volume, you can now disable them by setting the new optional template parameter enableBatchLogs to false when deploying the Splunk Dataflow job. For more details on the latest template parameters, refer to the template user documentation.
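
    As a sketch of how that deployment might look, the snippet below launches the template through the Dataflow templates.launch REST API with enableBatchLogs set to false. The project, region, subscription, and Splunk HEC url/token values are placeholders, and the parameter names other than enableBatchLogs are illustrative; consult the template user documentation for the authoritative list.

      from googleapiclient.discovery import build

      PROJECT, REGION = "my-project", "us-central1"  # placeholders

      dataflow = build("dataflow", "v1b3")
      response = dataflow.projects().locations().templates().launch(
          projectId=PROJECT,
          location=REGION,
          gcsPath="gs://dataflow-templates/2022-03-21-00_RC01/Cloud_PubSub_to_Splunk",
          body={
              "jobName": "pubsub-to-splunk",
              "parameters": {
                  "inputSubscription": f"projects/{PROJECT}/subscriptions/logs-sub",
                  "url": "https://splunk-hec.example.com:8088",  # placeholder HEC endpoint
                  "token": "YOUR-HEC-TOKEN",  # placeholder
                  "enableBatchLogs": "false",  # the new optional parameter
              },
          },
      ).execute()
      print(response["job"]["id"])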

    Enabling debug level logs

    The default logging level for Google-provided templates written using the Apache Beam Java SDK is INFO, which means all messages at INFO level and higher (i.e. WARN and ERROR) will be logged. If you’d like to enable lower log levels like DEBUG, you can do so by setting the --defaultWorkerLogLevel flag to DEBUG when starting the pipeline with the gcloud command-line tool.

    You can also override log levels for specific packages or classes with the --workerLogLevelOverrides flag. For example, the HttpEventPublisher class logs the final payload sent to Splunk at the DEBUG level. You can set the --workerLogLevelOverrides flag to {"com.google.cloud.teleport.splunk.HttpEventPublisher":"DEBUG"} to view the final message in the logs before it is sent to Splunk, while keeping the log level at INFO for other classes. Exercise caution when using this, as it will log all messages sent to Splunk under the Worker Logs tab in the console, which might lead to log throttling or reveal sensitive information.

    Putting it all together

    We put all this together in a single Monitoring dashboard that you can readily use to monitor your log export operations:

    Pipeline Throughput, Latency & Errors

    This dashboard is a single pane of glass for monitoring your Pub/Sub to Splunk Dataflow pipeline. Use it to ensure your log export is meeting your dynamic log volume requirements, by scaling to adequate throughput (EPS) rate, while keeping latency and backlog to a minimum. There’s also a panel to track pipeline resource usage and utilization, to help you validate that the pipeline is running cost-efficiently during steady-state.

    Pipeline Utilization and Worker Logs

    For specific guidance on handling and replaying failed messages, refer to Troubleshoot failed messages as part of the Splunk Dataflow reference guide. For general information on troubleshooting any Dataflow pipeline, check out the Troubleshooting and debugging documentation, and for a list of common errors and their resolutions look through the Common error guidance documentation. If you encounter any issue, please open an issue in the Dataflow templates GitHub repository, or open a support case directly in your Google Cloud Console.

    For a step-by-step guide on how to export GCP logs to Splunk, check out the Deploy production-ready log exports to Splunk using Dataflow tutorial, or use the accompanying Terraform scripts to automate the setup of your log export infrastructure along with the associated operational dashboard.

    Related Article

    What’s new with Splunk Dataflow template: Automatic log parsing, UDF support, and more

    Announcing new features for Splunk Dataflow template with improved compatibility with Splunk Add-on for GCP, more extensibility using use...

    Read Article
  • Google’s open-source solution to DFDL Processing Fri, 13 May 2022 16:00:00 -0000

    The cloud has become the choice for extending and modernizing applications, but some transitions are not straightforward, such as migrating applications that access data from a mainframe environment. Migration of the data and of the applications can be out of sync at certain points, so mechanisms need to be in place during the transition to support interoperability with legacy workloads and to access data out of the mainframe. For the latter, the Data Format Description Language (DFDL), an open standard modeling language from the Open Grid Forum (OGF), has been used to access data from a mainframe, e.g. in IBM Integration Bus.

    DFDL uses a model or schema that allows text or binary data to be parsed from its native format and presented as an information set out of the mainframe (i.e., a logical representation of the data contents, independent of the physical format).

    DFDL Processing with IBM App Connect

    Among solutions for parsing and processing data described by DFDL, one option has historically been IBM App Connect, which allows development of custom solutions via IBM DFDL. The following diagram represents a high-level architecture of a DFDL solution implemented on IBM App Connect:

    [Diagram: high-level architecture of DFDL processing with IBM App Connect]

    IBM App Connect brings stable integration to the table, at an enterprise-level cost. According to IBM’s sticker pricing as of May 2022, IBM App Connect charges $500 and above per month for using App Connect with IBM Cloud services. These prices exclude the cost of storing and maintaining DFDL definitions in the mainframe. With the introduction of Tailored Fit Pricing on IBM z15, the cost of maintaining the mainframe can range from $4,900 to $9,300 per month over a span of 5 years, which may be costly for a small or medium business that only wants to process data defined by DFDL.

    Introducing Google Open-Source DFDL Processor with Google Cloud

    At Google, our mission is to build for everyone, everywhere. With this commitment in mind, the Google Cloud team has developed and open-sourced a DFDL processor that is easily accessible and customizable for organizations to use.

    We understand that mainframes can be expensive to maintain and use, which is why we have integrated Cloud Firestore and Bigtable as the databases to store the DFDL definitions. Firestore can provide 100K reads, 25K writes, 100K deletes, and 1TB of storage for approximately $186 per month. Bigtable, on the other hand, provides a fast, scalable database solution for storing terabytes or even petabytes of data, also at a relatively low cost. Moving away from the mainframe and adopting cloud-native database solutions can save organizations thousands of dollars every month.

    Next, we have substituted App Connect with a combination of our open-source DFDL processor, the Cloud Pub/Sub service and the open-source Apache Daffodil library. Pub/Sub provides the connection between the mainframe and the processor, and from the processor to the downstream applications. The Daffodil library helps compile schemas and output infosets for a given DFDL definition and message. The total cost of employing the Pub/Sub service and the Daffodil library comes out to approximately $117 per month, which means an organization can save a minimum of $380 per month by using this solution.

    The table below shows a summary of the cost difference breakdown between the solutions as discussed above:

    [Table: cost comparison between the IBM App Connect solution and the Google Cloud open-source solution]

    How it works

    The data described by the DFDL usually needs to be available in widely used formats such as JSON, in order to be consumed by downstream applications that might already have been migrated to a cloud-native environment. To achieve this, cloud-native applications and services can be implemented in conjunction with Google Cloud services: they accept the textual or binary data as input from the mainframe, fetch the corresponding DFDL from a database, and finally compile and output the equivalent JSON for the downstream applications to consume.

    The following diagram describes the high-level architecture:

    [Diagram: high-level architecture of the open-source DFDL processor on Google Cloud]

    An application can be built to process the information received from the mainframe, e.g. a DFDL Processor service, leveraging the Daffodil API to parse the data against the corresponding DFDL schema and output the JSON.

    DFDL schema definitions can potentially be migrated to and stored in Firestore or Bigtable. Since these definitions rarely change and can be stored in a key-value format, the storage of preference is a managed non-relational database.

    Google Cloud Pub/Sub provides an eventing mechanism that receives the binary or textual message from a data source, i.e. the mainframe, in a Pub/Sub topic. This allows the DFDL Processor to access the data, retrieve the corresponding DFDL definition from Firestore or Bigtable, and finally pass both on to the Daffodil API to compile and output the JSON result. The JSON result is then published to a resulting Pub/Sub topic for any downstream application to consume. It is recommended to follow the CloudEvents schema specification, which describes events in common formats to provide interoperability across services, platforms and systems.
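
    To make the flow concrete, here is a minimal sketch of such a DFDL Processor service in Python. The resource names (mainframe-events-sub, dfdl-json-out, a dfdl_definitions Firestore collection and a schema_id message attribute) are hypothetical, and parse_with_daffodil is a stand-in for the Daffodil API, which in practice runs on the JVM (Java/Scala).

      import json

      from google.cloud import firestore, pubsub_v1

      PROJECT = "my-project"  # placeholder

      db = firestore.Client(project=PROJECT)
      subscriber = pubsub_v1.SubscriberClient()
      publisher = pubsub_v1.PublisherClient()
      input_sub = subscriber.subscription_path(PROJECT, "mainframe-events-sub")
      output_topic = publisher.topic_path(PROJECT, "dfdl-json-out")

      def parse_with_daffodil(data: bytes, dfdl_schema: str) -> dict:
          """Stand-in: compile the DFDL schema and parse the message to an infoset."""
          raise NotImplementedError("delegate to the JVM-based Daffodil API")

      def handle(message: pubsub_v1.subscriber.message.Message) -> None:
          # Each message names its DFDL definition via an attribute.
          schema_id = message.attributes.get("schema_id", "default")
          snapshot = db.collection("dfdl_definitions").document(schema_id).get()
          infoset = parse_with_daffodil(message.data, snapshot.to_dict()["schema"])
          # Publish the JSON result for downstream applications to consume.
          publisher.publish(output_topic, json.dumps(infoset).encode("utf-8"))
          message.ack()

      # Block and process messages as they arrive.
      subscriber.subscribe(input_sub, callback=handle).result()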

    You can find examples of the implementation on GitHub:

    Conclusion

    In this post, we have discussed different pipelines used to process data defined by DFDL, and cost comparisons of these pipelines. Additionally, we have demonstrated how to use Cloud Pub/Sub, Firestore, and Bigtable to create a service that is capable of listening for binary event messages, extracting the corresponding DFDL definition from a managed database, and processing it to output JSON that can then be consumed by downstream applications using well-established technologies and libraries.


    1. Price comparison analysis as of May 2022 and subject to change based on usage

    Related Article

    5 principles for cloud-native architecture—what it is and how to master it

    Learn to maximize your use of Google Cloud by adopting a cloud-native architecture.

    Read Article
  • How a top gaming company transformed its approach to security with Splunk and Google Cloud Fri, 13 May 2022 16:00:00 -0000

    Since Aristocrat’s founding in 1953, technology has constantly transformed gaming, and the digital demands on our gaming business are a far cry from the challenges we faced when we started. As we continue to expand globally, security and compliance are top priorities.

    Managing IT security for several gaming subsidiaries and our core business became more complex as we entered into new markets and scaled up our number of users. We needed a centralized platform that could give us full visibility into all of our systems and efficient monitoring capabilities to keep data and applications secure. We also needed the ability to secure our systems without compromising user experiences.

    We turned to Google Cloud and Splunk to better manage complexity and support highly efficient, secure, and more dynamic gaming experiences for everyone. We are committed to using today’s modern technologies to give players the best possible experiences.

    Bringing our digital footprint into the cloud

    When we set out on our digital transformation, we looked to address many business requirements. These requirements included:

    • Regulation: We wanted a platform that could efficiently address our industry’s stringent and global regulatory compliance requirements. 

    • Player experience: Our IT environment must support smooth gaming experiences to keep users engaged and satisfied.

    • Scalability: As we grow and diversify, meeting the changing demands of an increasingly global gaming community, we need an easily scalable platform to align with our current and future needs.

    Google Cloud offered us the perfect foundation through solutions such as Compute Engine, Google Kubernetes Engine, BigQuery, and Google Cloud Storage. These acted as the right infrastructure components for us for the following reasons:

    • Google Cloud is globally accessible and supports compliance, helping to streamline security and regulatory processes for our team. 

    • With Google Cloud, we can manage our entire development and delivery processes globally with fast and efficient reconciliation of regional compliance requirements. 

    • When we need to adjust existing infrastructure or deliver new capabilities, Google Cloud accelerates the process and takes the heavy lifting off of our team. 

    • Google Cloud allows us to support tens of thousands of players on each of our apps while experiencing minimal downtime and low latency. The importance of this support can’t be overstated in an industry where players have little to no patience if lags in games occur.

    We migrated our back-office IT stack alongside our consumer-facing production applications to Google Cloud given our positive experiences with compliance, security, scalability, and process management. This migration has significantly accelerated our digital transformation while streamlining our infrastructure for faster and more cost-effective performance.

    In many ways, Google Cloud has been, with maybe a pun intended, a game-changer for us. For instance, when we suddenly had to support a lot of remote work during the COVID-19 pandemic, native identity and access management tools in Google Cloud allowed us to retire costly VPNs used for backend access and quickly adopt a more easily managed, cost-effective zero-trust security posture.

    Accessing vital third-party partners and managed services

    Aristocrat has many IT needs best addressed in a multi-cloud environment. Google Cloud is particularly attractive given its strong cloud interoperability, as well as the many products and services available on Google Cloud Marketplace. The marketplace accelerated our deployment of key third-party apps including Splunk and Qualys.

    Given the personal information we store and the global regulatory compliance statutes with which we must comply, security lies at the heart of our business. Splunk is a critical component of our digital transformation because it offers solutions that provide the enhanced monitoring capabilities and visibility we need. The integration between Splunk and Google Cloud gives us confidence that our data is secure, while simplified billing through Google Cloud Marketplace makes payments and license tracking easier for our procurement team.

    As part of our protected environment, we use the Splunk platform as our security information and event management system, leveraging the InfoSec app for Splunk that provides continuous monitoring and advanced threat detection to significantly improve our security.

    We can manipulate and present data in Splunk in a way that provides us with a single pane-of-glass for our hybrid, multi-cloud environment and our third-party apps and systems. Splunk observability tools have likewise helped us to track browser-based applications like our online gaming apps to monitor details related to security and performance.

    Splunk and Google Cloud have transformed how we operate. We can now quickly ingest and analyze data at scale within our refined approach to security management by offloading software management to Splunk and Google Cloud. This ability enables us to approach security more strategically, and positions us to integrate more AI/ML capabilities into our products for even greater governance and performance.

    This is just the beginning of our journey with Splunk and Google Cloud. We’re excited to see the innovation we can continue bringing to the gaming community worldwide.

  • Helping global governments and organizations adopt Zero Trust architectures Fri, 13 May 2022 13:00:00 -0000

    For more than a decade, Google has applied a Zero Trust approach to most aspects of our operations. Zero Trust’s core tenet–that implicit trust in any single component of a complex, interconnected system can create serious security risks–is fundamental to how we operate and build our security architecture. 

    Early in our security journey, we realized that despite our best efforts user credentials would periodically fall into the hands of malicious actors. This is why we developed the BeyondCorp framework. We needed additional layers of defense against unauthorized access that would not impede user productivity. We also understood that software that interacts with the larger world should not have a perimeter-based trust model. These realizations led to the layered protection in our BeyondProd framework, which extends the Zero Trust paradigm to our production workloads. 

    Earlier this year, the United States Office of Management and Budget (OMB) released a Federal Strategy to Move the U.S. Government Towards a Zero Trust Architecture. This marks an important step for the U.S. government’s efforts to modernize under Executive Order 14028 on Improving the Nation’s Cybersecurity. In parallel, guidance from the United Kingdom’s National Cyber Security Centre (NCSC) has also called for a move to a Zero Trust approach to security, and in 2021 it outlined its Zero Trust architecture design principles.

    Adopting a Zero Trust approach can help organizations inside and outside the public sector stay ahead of both regulatory requirements and security threats, but it requires thoughtful planning and execution. Our goal is to bring the best practices for Zero Trust together in one place, leveraging the experiences and knowledge of our existing customers, and Google’s own experience with implementing Zero Trust. 

    How Google Cloud can help government agencies move toward Zero Trust

    Agencies can rely on Google Zero Trust capabilities for remote access, secure collaboration, and boundary security. To better serve the Zero Trust needs of our customers, we introduced BeyondCorp Enterprise in January 2021, a solution that provides Zero Trust secure access to resources and applications in the cloud and on-premises. BeyondCorp Enterprise was built based on years of Google’s own innovation as we implemented Zero Trust globally for ourselves. It leverages the Chrome browser and Google’s global network, and it offers integrated real-time threat and data protection.

    Here are five ways BeyondCorp Enterprise can be applied to help organizations adopt the Zero Trust cybersecurity principles set forth in the recent White House memorandum (M-22-09) and other global government guidance for Zero Trust. 

    1. Enable enterprise applications to be used over the public internet: It’s no secret that VPN usage poses daily burdens and long-term challenges for IT and cybersecurity managers, as well as end-users. BeyondCorp Enterprise provides users with seamless and secure access to web applications (including SaaS apps and apps hosted on any cloud), plus central management controls and threat and data protection capabilities, all built-in to the Chrome browser. Through BeyondCorp Enterprise, end-users can access applications simply and still benefit from enterprise-grade security, without sacrificing their productivity or user experience. 

    2. Leverage phishing-resistant MFA to access secure resources: Many cyberattacks start with phishing messages that lead users to infected websites and attempt to steal credentials. The use of phishing-resistant MFA, as recommended by M-22-09, can protect personnel from sophisticated online attacks. BeyondCorp Enterprise supports strong phishing-resistant authentication, by allowing factors such as Titan Security Keys to be used as attributes in access policies that are enforced at the application layer.

    Organizations can customize how to incorporate phishing-resistant MFA methods into their access policies for individual applications and resources. Phishing protection is also built into the Chrome browser, powered by Google Safe Browsing, and these capabilities block access to malicious content, detect phishing sites, prevent malware transfers, and generate reports of unsafe activity, adding even more protection against bad actors.

    3. Use context-aware authorization: The U.S. federal strategy states that a Zero Trust architecture should incorporate more granularly and dynamically defined permissions and that every request for access should be evaluated to determine whether it is appropriate. With context-aware authorization, organizations can build and customize access policies to include different contextual signals about a user including their role, their location, and even the time of day. Every interaction between a user and a BeyondCorp-protected resource is evaluated in real-time against the resource’s access policy to ensure users are and remain authorized to access it, with continuous authorization for all interactions at a per request level. 

    4. Incorporate device-level signal into authentication: At Google, we believe that trust must be granted based on what is known about a user’s identity and their device. We are pleased that OMB similarly recommends that authentication incorporate at least one device-level signal alongside identity information. Since BeyondCorp Enterprise supports device-level attributes without requiring users to install agents, this can be done easily by leveraging the Endpoint Verification extension in the Chrome browser, where administrators can gather endpoint security posture information and easily construct and implement granular resource access policies. The ability to collect and utilize this information through an agentless approach is especially helpful for BeyondCorp Enterprise customers who support a workforce with bring-your-own-device policies or unmanaged devices.

    5. Include the extended workforce in your Zero Trust strategy: A Zero Trust approach aimed to provide secure access to the right users, at the right time, and for the right purposes should be inclusive of all users, not just full-time staff. Government agencies rely on contractors and partners to carry out many important missions. Unfortunately, the extended workforce is often more vulnerable to attacks if they are given too much privileged access or if their security practices are not properly assessed before access is provisioned. At the same time, federal administrators can’t always manage third-party devices or software directly, which can make secure access challenging. 

    BeyondCorp Enterprise supports a feature called protected profiles, an ideal solution for granting Zero Trust access to the extended workforce. It enables users to securely access resources from unmanaged devices and be protected by the same security capabilities without needing to install agents. Furthermore, administrators can gain visibility into risky activities and view any security events that are generated from within protected profiles.

    Applying the NCSC Zero Trust principles on Google Cloud

    Last year, the U.K. government’s NCSC launched its Zero Trust architecture design principles to help organizations securely adopt a Zero Trust architecture. To help private and public sector organizations in the U.K., the Google Cybersecurity Action Team (GCAT) released a detailed research paper that outlines how organizations can leverage Google Cloud technologies and services to align with these principles. This is a technical guide aimed at enterprise and security architects charged with developing and executing a Zero Trust strategy under the principles outlined by the NCSC, including: 

    • Know your architecture, including users, devices, services and data with Google Cloud Professional Services Organization (PSO) who can support discovery, planning and risk mitigation.

    • Know your User, Service and Device identities including reference architectures for Cloud Identity.

    • Assess your user behavior, device and service health by leveraging built-in reporting from Google Cloud and Chronicle.

    • Use policies to authorize requests with BeyondCorp Enterprise policy-based authorization. 

    • Authenticate & Authorize everywhere by reviewing the BeyondCorp and BeyondProd frameworks which combine to deliver ubiquitous authentication and authorization.

    • Focus your monitoring on users, devices and services with device management and Cloud native monitoring capabilities. 

    • Don't trust any network, including your own. Review details on Google’s Secure by Design infrastructure.

    • Choose services designed for Zero Trust. Review how to protect modern and legacy applications with BeyondCorp. 

    For more detail on how we’re supporting the U.K.’s NCSC, please review our recent research paper for insight into their priorities, including where Google discusses Secure by Design principles and how to respond to security incidents.

    Zero Trust assessment and planning services for organizations

    Organizations that are managing complex environments while undergoing Zero Trust adoption could benefit strongly from experienced support and guidance. The Google Cybersecurity Action Team (GCAT) is committed to helping customers meet Zero Trust security and compliance requirements in the cloud through specialized consulting engagements and workshops for public sector customers. Read more about how growing cybersecurity requirements for U.S. federal government customers via executive orders and White House mandates are being supported through Google Cloud solutions.

    GCAT’s multi-week Zero Trust Foundations engagement helps organizations build a strategy to achieve a Zero Trust security model across their operations. Zero Trust Foundations is co-delivered by Google Cloud’s Office of the CISO and our public sector Professional Services Organization. It can help focus and accelerate customers’ Zero Trust efforts by sharing lessons learned from Google’s own BeyondCorp zero-trust journey, and our global implementation of defense-in-depth best practices. Contact us today to learn more.

    To learn more about ways Google Cloud can help organizations embarking on a Zero Trust journey, tune into our second annual Google Cloud Security Summit on May 17 and hear directly from customers who are already using our Zero Trust solutions to achieve their organization’s security goals. 


    About the Authors
    Jeanette Manfra is the former Assistant Director for the Cybersecurity and Infrastructure Security Agency at the Department of Homeland Security. Dan Prieto is the former Director of the Defense Industrial Base Cybersecurity program at the Department of Defense. Both Dan and Jeanette also served in the White House on the staff of the National Security Council's cybersecurity directorate.

    Related Article

    Your guide to sessions at Google Cloud Security Summit 2022

    Here’s a helpful guide to sessions at Google Cloud’s Security Summit 2022

    Read Article
  • Introducing Open Source Insights data in BigQuery to help secure software supply chains Thu, 12 May 2022 20:45:00 -0000

    Today we're announcing a new Google Cloud Dataset from Open Source Insights which will help developers better understand the structure and security of the software they use. This dataset provides access to critical software supply chain information for developers, maintainers and consumers of open-source software.

    Your users rely not only on the code you write, but also on the code your code depends on, the code that code depends on, and so on. This web of dependencies forms a dependency graph, and while each node in the graph brings useful functionality to your project, they may also introduce security vulnerabilities, licensing issues, or other surprises, as recent events like the log4j issue demonstrated. To understand your code, you must have an accurate view of its dependency graph.

    The Open Source Insights project scans millions of open-source packages from the npm, Go, Maven, PyPI, and Cargo ecosystems, computes their dependency graphs, and annotates those graphs with security advisories, license information, popularity metrics, and other metadata. The dataset is regularly updated, keeping it current and relevant while also providing a snapshotted view of change over time. Generated by resolving each package’s dependency constraints, this data provides precise, accurate, and actionable dependency graphs.

    The rate of change in open-source packages is significant. Our analysis shows that roughly 15% of the packages in npm see changes to their dependency sets each day, and for 40,000 of those packages (2% of packages in npm) this results in a change to their license or advisory set. Keeping up with these changes is critical yet intractable without good tooling.

    This new dataset allows anyone to use Google Cloud BigQuery to explore and analyze the dependencies, advisories, ownership, license and other metadata of open-source packages across supported ecosystems, and how this metadata has changed over time.

    We are eagerly looking forward to seeing how this data will be used. Whether you’re a developer, security engineer, or researcher, you can use this public dataset to analyze components of your software supply chain, and integrate this information with your existing tools and pipelines. 

    How the Open Source Insights dataset works

    We’re bringing Google’s mission to “organize the world’s information and make it universally accessible and useful” to open-source software. Open Source Insights examines each package in the packaging systems we cover, including npm, Go, Maven (Java), PyPI (Python), and Cargo (Rust), with more to come. A full, detailed graph of each package’s dependencies and their properties is constructed and annotated with security advisory, license, owner, release information and other metadata, making a rich dataset covering entire package management language ecosystems.

    The dataset is updated regularly, making this a valuable resource for tracking ecosystem level changes over time, analyzing the scope and impact of issues, or integrating into custom dashboards and build systems.

    Getting started with the Open Source Insights dataset

    To begin exploring these public dataset tables, you can look at the schema and try some sample queries, like the following examples. As with all other Google Cloud Datasets, users can access the data at no charge for up to 1TB/month in queries and up to 10GB/month in storage through BigQuery’s free tier. SQL queries above these thresholds are subject to regular BigQuery pricing. Users can also leverage the BigQuery sandbox to access BigQuery without needing to create a Google Cloud account or provide credit card information, subject to the sandbox’s limits and BigQuery’s free tier thresholds.

    What are the most common licenses across each ecosystem?

    We can aggregate the license data across packages within each dependency management system to get a list of the top three licenses per system. To do so we first find the newest snapshot in the dataset. Then within that snapshot we count the number of unique packages with at least one version using each license (multiple versions of a package are not double counted).

      -- Find the most recent snapshot.
      DECLARE
        Time TIMESTAMP DEFAULT (
          SELECT
            MAX(Time)
          FROM
            `bigquery-public-data.deps_dev_v1.Snapshots`);

      WITH
        -- Compute the count of unique packages per system and license.
        Counts AS (
          SELECT
            System,
            License,
            COUNT(DISTINCT Name) AS NPackages
          FROM
            `bigquery-public-data.deps_dev_v1.PackageVersions`
          CROSS JOIN
            UNNEST(Licenses) AS License
          WHERE
            SnapshotAt = Time
          GROUP BY
            System,
            License),
        -- Compute a rank for each license within its system.
        Ranked AS (
          SELECT
            System,
            License,
            NPackages,
            ROW_NUMBER() OVER (PARTITION BY System ORDER BY NPackages DESC) AS LicenseRank
          FROM
            Counts)

      -- Finally output the top 3 per system.
      SELECT
        System,
        License,
        NPackages
      FROM
        Ranked
      WHERE
        LicenseRank <= 3
      ORDER BY
        System,
        LicenseRank;
    [Screenshot: query results showing the top three licenses per ecosystem]

    What are the most depended upon package versions?

    We can use the dependency graphs to identify the most depended upon package versions in the cargo ecosystem. To do so, we filter all packages and available versions for just the release with the highest semantic version per package. We then sum the number of these highest release versions that depend on each version.

      -- The dependency management system whose packages we will query.
      DECLARE
        Sys STRING DEFAULT 'CARGO';

      -- Find the most recent snapshot.
      DECLARE
        Time TIMESTAMP DEFAULT (
          SELECT
            MAX(Time)
          FROM
            `bigquery-public-data.deps_dev_v1.Snapshots`);

      WITH
        -- Select just the package-versions that are considered releases
        -- in the system of interest.
        Releases AS (
          SELECT
            Name,
            Version,
            VersionInfo
          FROM
            `bigquery-public-data.deps_dev_v1.PackageVersions`
          WHERE
            SnapshotAt = Time
            AND VersionInfo.IsRelease
            AND System = Sys),
        -- For each package, find its release with the highest version number.
        HighestReleases AS (
          SELECT
            Name,
            Version
          FROM (
            SELECT
              Name,
              Version,
              ROW_NUMBER() OVER (PARTITION BY Name ORDER BY VersionInfo.Ordinal DESC) AS RowNumber
            FROM
              Releases)
          WHERE
            RowNumber = 1)

      -- Finally compute the number of dependents per package-version and
      -- rank package-versions by this count in descending order.
      SELECT
        D.Dependency.Name,
        D.Dependency.Version,
        COUNT(*) AS NDependents
      FROM
        `bigquery-public-data.deps_dev_v1.Dependencies` AS D
      JOIN
        HighestReleases AS H
      ON
        H.Name = D.Name
        AND H.Version = D.Version
      WHERE
        D.SnapshotAt = Time
        AND D.System = Sys
      GROUP BY
        D.Dependency.Name,
        D.Dependency.Version
      ORDER BY
        NDependents DESC
      LIMIT
        10;
    [Screenshot: query results showing the ten most depended-upon Cargo package versions]
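
    Beyond the console, the same queries run anywhere the BigQuery client libraries do. Here is a minimal Python sketch that reuses only the tables and columns from the samples above to count distinct packages per ecosystem in the latest snapshot; credentials are taken from the environment.

      from google.cloud import bigquery

      QUERY = """
      DECLARE Time TIMESTAMP DEFAULT (
        SELECT MAX(Time) FROM `bigquery-public-data.deps_dev_v1.Snapshots`);

      SELECT System, COUNT(DISTINCT Name) AS NPackages
      FROM `bigquery-public-data.deps_dev_v1.PackageVersions`
      WHERE SnapshotAt = Time
      GROUP BY System
      ORDER BY NPackages DESC;
      """

      client = bigquery.Client()
      # A multi-statement script returns the result set of its final SELECT.
      for row in client.query(QUERY).result():
          print(f"{row.System}: {row.NPackages} packages")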

    What’s next for software supply chain security?

    We hope this dataset will make it easier for developers to learn more fundamental information about their dependencies. You can also explore the Open Source Insights website for the latest open-source software insights and visualizations, and learn more about our open source security and software supply chain security solutions at the upcoming Google Cloud Security Summit on May 17.

    Related Article

    With software supply chain security, developers have a big role to play

    Developers need to demonstrate a secure software supply chain in order to comply with regulations and keep their organization out of the ...

    Read Article
  • Sharpen your machine learning skills at Google Cloud Applied ML Summit Thu, 12 May 2022 17:00:00 -0000

    Artificial intelligence (AI) and particularly machine learning (ML) continue to advance at breakneck pace. 

    We see it throughout projects and commentaries across the broader technology industry. We see it in the amazing things our customers are doing, from creating friendly robots to aid childhood development, to leveraging data for better manufacturing and distribution, to fostering internal innovation through hackathons. And we see it in our own research and product development at Google, from improved machine learning models for our Speech API, to integrations that streamline data management and ML modeling, to making AlphaFold (DeepMind’s breakthrough protein structure prediction system) available to researchers throughout the world using Vertex AI.

    At Google Cloud, we’ve helped thousands of companies accelerate their AI efforts, empower their data scientists, and extend the ability to build AI-driven apps and workflows to more people, including those without data science or ML expertise. Next month, we’ll take the next step in this journey with our customers at the Google Cloud Applied ML Summit.

    Join us June 9 for this digital event, which will bring together some of the world’s leading ML and data science professionals to explore the latest cutting-edge AI tools for developing, deploying, and managing ML models at scale. 

    On-demand sessions kick off at 9:00 AM Pacific with “Accelerating the deployment of predictable ML in production,” featuring VP & GM of Google Cloud AI & Industry Solutions Andrew Moore; Google Cloud Developer Advocate Priyanka Vergadia; Ford Director of AI and Cloud Bryan Goodman; and Uber AI Director of Engineering Smitha Shyam.

    At the summit, you’ll learn how companies like General Mills, Vodafone, H&M, and CNA Insurance are developing, deploying, and safely managing long-running, self-improving AI services. Get insights in practitioner sessions where you can find new ways to:

    • Train high-quality ML models in minutes with AutoML innovations born of the latest Google Brain research, explored in the session “End-to-end AutoML for model prep.”

    • Make the most of your Google Cloud investments in Vertex AI Training and Vertex AI Prediction to help you deploy custom models built on TensorFlow, PyTorch, scikit-learn, XGBoost, and other frameworks. Check out the session “ML prediction and serving: Vertex AI roadmap.”

    • Streamline the process to audit, track, and govern ML models as they adapt to live data within a dynamic environment, without degrading performance. Dive into this topic in the session “Model governance and auditability.”

    You can choose from over a dozen sessions across three tracks: “Data to ML Essentials,” “Fast-track Innovation,” and “Self-improving ML.” Session topics range from MLOps best practices, to Google Cloud customer experiences, to the importance of model auditability and explainable and responsible AI, with multiple customer panels and “ask me anything” sessions to help you get the insights and develop the skills to take your business’s ML efforts to the next level.

    We’re committed to continuing to serve our customers in this rapidly evolving space, and we’re excited to learn and collaborate with you at this event. To register, visit this link to reserve your seat for the Applied ML Summit.

    Related Article

    Unified data and ML: 5 ways to use BigQuery and Vertex AI together

    Vertex AI is a single platform with every tool you need to build, deploy, and scale ML models. Get started quickly with five easy integra...

    Read Article
  • Your guide to sessions at Google Cloud Security Summit 2022 Thu, 12 May 2022 17:00:00 -0000

    Google Cloud Security Summit is just a few days away! We have an exciting agenda with a keynote, demo, and breakout sessions across four tracks: Zero Trust, Secure Software Supply Chain, Ransomware & Emerging Threats, and Cloud Governance & Sovereignty. By attending this summit, you will be the first to learn about the new products and advanced capabilities we are announcing from Google Cloud Security, and discover new ways to define and drive your security strategy and solve your biggest challenges.

    We hope you’ll join us for the Security Summit digital online event on May 17, 2022, to learn from experts, explore the latest tools, and share our vision for the future of security. Register here for the event and watch the sessions live and on-demand. If you are in Europe, the Middle East, or Africa please visit the EMEA page to view summit events in your time zone and captions in your local language.

    Security Summit Keynote

    Charting a safer future with Google Cloud

    Featured Speakers:
    Chris Inglis, National Cyber Director, Executive Office of the President White House
    Jonathan Meadows, Head of Cloud Cyber Security Engineering, Citibank
    Sunil Potti, General Manager and Vice President of Cloud Security, Google Cloud

    Cybersecurity remains at the top of every organization’s agenda. Join our opening keynote to hear how Google Cloud’s unique capabilities and expertise can help organizations, large and small, in the public or private sector, address today’s most prominent security challenges and imperatives: Zero Trust, Securing the Software Supply Chain, Ransomware and other emerging threats, Cloud governance and Digital Sovereignty. Whether you use our trusted cloud for digital transformation, or continue to operate on-premise or in other clouds, you’ll learn how we can help you be safer with Google.

    Demo

    Modern threat detection, investigation, and response with Google Cloud’s SecOps suite

    Featured Speakers:
    Arnaud Loos, Customer Engineer, Google Cloud
    Svetla Yankova, Head of Customer Engineering, Google Cloud

    To stay secure in today’s growing threat landscape, organizations must detect and respond to cyber threats at unprecedented speed and scale. This demonstration will showcase Google Cloud’s Security Operations Suite, and its unique approach to building modern threat detection, investigation and response.

    Breakout Sessions

    We have 19 breakout sessions featuring Google speakers, our customers, and partners. The breakout sessions are available across four tracks covering Zero Trust, Secure Software Supply Chain, Ransomware & Emerging Threats, and Cloud Governance & Sovereignty.


    Zero Trust Track 

    1. How Google is helping customers move to Zero Trust

    Featured Speakers:
    Aman Diwakar, Security Engineering Manager - Corporate Security, Door Dash
    Jeanette Manfra, Senior Director, Risk and Compliance, Google Cloud
    Tanisha Rai, Product Manager, Google Cloud

    Enterprises around the globe are committed to moving to a Zero Trust architecture, but actually making that happen can be hard. Every day, we hear from customers asking how they can set up a Zero Trust model like Google’s, and we are here to help. Tune in to this session to hear speakers discuss how Google did it and how we can now help you with a comprehensive set of products, advisory services, and solutions. Whether you’re “born in the cloud,” a government agency looking to meet federal directives, or somewhere in between, Google Cloud products like BeyondCorp Enterprise and our set of partner solutions can help you jump-start your Zero Trust approach.

    2. A look ahead: the future of BeyondCorp Enterprise

    Featured Speakers:
    Prashant Jain, Product Manager, Google Cloud
    Jian Zhen, Product Manager, Google Cloud

    Google pioneered Zero Trust. Now we’re pioneering rapid Zero Trust transformation. We know one size does not fit all and Zero Trust capabilities should conform to your needs – not vice versa. Join this session to learn more about how BeyondCorp Enterprise enables you to quickly and flexibly apply a Zero Trust approach to meet your application use cases and security requirements. Hear from product leaders as they share updates on new BeyondCorp capabilities, partnerships, and integrations that enable you to deliver rapid wins and avoid drawn-out deployment projects.

    3. CrowdStrike and Deloitte: Managing cloud migration, remote workforce, and today's threats

    Featured Speakers:
    Chris Kachigian, Sr. Director, Global Solutions Architecture, CrowdStrike
    Mike Morris, Detect and Respond CTO, Head of Engineering, Deloitte
    McCall McIntyre, Strategic Technology Partner Lead, Google Cloud

    Your organization is on its cloud migration journey, you have a remote or hybrid workforce, and your extended infrastructure is more dependent than ever on disparate devices, partners, and apps. To make things even more complicated, threat actors are targeting you across all of these fronts, causing business disruption. How can you secure this new extended environment without negatively impacting user productivity? Join this Lightning Talk to learn how CrowdStrike and Deloitte have helped customers solve for just that.

    4. Working safer with Google Workspace

    Featured Speakers:
    Neil Kumaran, Product Lead, Gmail & Chat Security & Trust, Google Cloud
    Nikhil Sinha, Sr. Product Manager, Workspace Security, Google Cloud

    Google Workspace is on a mission to make phishing and malware attacks a thing of the past. Google keeps more people safe online than anyone else in the world. According to our research, Gmail blocks more than 99.9% of malware and phishing attempts from reaching users’ inboxes. We do this by using our expertise protecting against threats at scale to protect every customer by default. This session will provide an overview of how Google Workspace’s layered, AI-powered protections function across Gmail, Docs, Sheets, Slides, and Drive. We’ll examine real-life examples of large malware attacks to showcase how advanced capabilities like sandboxing, deep-learning-based malicious document classification, and performant, deep antivirus protections work to help stop threats.

    5. Securing IoT devices using Certificate Authority Service

    Featured Speakers:
    Sudhi Herle, Director, Engineering & Product Management, Android Platform Security, Google Cloud
    Anoosh Saboori, Product Manager, Google Cloud
    Mahesh Venugopala, Director Security, Autonomic

    Scaling security for IoT devices can be challenging. As the IoT market continues to grow, it is imperative that strong security measures are put into place to protect the information these devices send to the cloud. Join this session to learn how Google customers can leverage capabilities such as Certificate Authority Service to apply Zero Trust principles to secure IoT devices.


    Secure Software Supply Chain Track

    6. Building trust in your software supply chain

    Featured Speakers:
    Nikhil Kaul, Head of Product Marketing - Application Modernization, Google Cloud
    Victor Szalvay, Outbound Product Manager, Google Cloud

    Whether you’re building an application on Kubernetes, or in a serverless or virtual machine environment, end-to-end security is critical for mitigating the vulnerabilities lurking within open source software, as well as those related to recent cybersecurity attacks and data breaches. Come learn how you can meet guidelines from the U.S. government and adopt an in-depth, security-first approach with Google Cloud that embeds security at every step of your software life cycle. 

    7. Protecting and securing your Kubernetes infrastructure with enterprise-grade controls

    Featured Speaker: 
    Gari Singh, Product Manager, Google Cloud

    Kubernetes is not just a technology. It’s also a model for creating value for your business, a way of developing apps and services, and a means to help secure and develop cloud-native IT capabilities for innovation. Google Kubernetes Engine (GKE) allows your developers to spend less time worrying about security and to achieve more secure outcomes. In this session, learn how you can set up enterprise-grade security for your app right out of the box. We’ll cover the latest security controls, hardened configuration, and policies for GKE, including confidential computing options. 

    8. Managing the risks of open source dependencies in your software supply chain

    Featured Speaker:
    Andy Chang, Group Product Manager, Google Cloud

    Open-source software code is available to the public – free for anyone to use, modify, or inspect. But securing open-source code, including fixing known vulnerabilities, is often done on an ad hoc, volunteer basis. Join this session to learn how our new Google Cloud solution addresses open-source software security.


    Ransomware and Emerging Threats Track

    9. A holistic defense strategy for modern ransomware attacks

    Featured Speaker:
    Adrian Corona, Head of Security Solutions GTM, Google Cloud

    Making your organization resilient against modern ransomware attacks requires holistic detection, protection, and response capabilities. In this session, we’ll demonstrate how you can apply a cyber resilience framework, and products from Google Cloud and partners, to help thwart threats and combat ransomware attacks.

    10. Taking an autonomic approach to security operations

    Featured Speakers: 
    Anton Chuvakin, Head of Security Solution Strategy, Google Cloud
    Iman Ghanizada, Head of Autonomic Security Operations, Google Cloud

    Security operations centers are constantly pressed for time. Analysts seldom have the luxury to “clear the board” of active attacks and, as a result, can often feel overwhelmed. In this talk, we’ll show you how you can turn the tide and leverage Chronicle and Siemplify to prioritize and automate your SecOps, giving analysts valuable time back to focus on the threats that matter.

    11. Insight and perspective from the Unit 42 Ransomware Threat Report 

    Featured Speakers:
    Joshua Haslett, Strategic Technology Partnership Manager, Google Cloud
    Josh Zelonis, Field CTO and Evangelist, Palo Alto Networks

    Ransomware groups turned up the pressure on their victims in 2021, demanding higher ransoms and using new tactics to force them into paying. In fact, the average ransomware demand in cases handled by Unit 42 in 2021 climbed 144% from 2020. At the same time, there was an 85% increase in the number of victims who had their names and other details posted publicly on dark web “leak sites” that ransomware groups use to coerce their targets. As the ransomware landscape continues to evolve, and threat actors leverage creative new techniques to cripple business operations, what can your organization do to prepare and stay ahead of threats? Join us for this presentation as we discuss the key findings in our 2022 Unit 42 Ransomware Threat Report.

    12. Cloud-native risk management and threat detection with Security Command Center

    Featured Speakers:
    Thomas Meriadec, Head of Cloud Platforms Security & Compliance, Veolia
    Tim Wingerter, Product Manager, Google Cloud

    As organizations move to the cloud, continuous monitoring of the environment for risk posture and threats is critical. In this session, learn how Security Command Center Premium provides risk management and threat detection capabilities to help you manage and improve your cloud security and risk posture. Join us to hear about Veolia’s experience with Security Command Center Premium.

    13. Securing web applications and APIs anywhere

    Featured Speakers:
    Shelly Hershkovitz, Product Manager, Apigee API Security, Google Cloud
    Gregory Lebovitz, Product Management, Cloud Network Security, Google Cloud

    Application attack vectors are increasing rapidly, and many organizations seek to protect against the different types of application and API attacks. Join this session to learn how Google Cloud can help protect and secure applications and APIs from fraud, abuse, and attacks – such as DDoS, API abuse, bot fraud, and more – using our Web App and API Protection (WAAP) offering.

    14. Maximizing your detection & response capabilities

    Featured Speakers:
    Magali Bohn, Director, Partnerships and Channels GSEC, Google Cloud
    Brett Perry, CISO, Dot Foods
    Jason Sloderbeck, Vice President, Worldwide Channels, CYDERES

    Join Google Cloud, Cyderes (Cyber Defense and Response), and Dot Foods as we discuss best practices and real-world use cases that enable a company to detect threats and respond to incidents in real time. Learn about their autonomic security operations journey and how they’ve scaled a robust, cost-efficient program to accelerate their digital transformation and overall growth.


    Cloud Governance & Sovereignty Track

    15. Achieving your digital sovereignty with Google Cloud

    Featured Speaker:
    Dr. Wieland Holfelder, Vice President Engineering, Google Cloud

    Google Cloud’s unique approach, which includes strong local partnerships, helps organizations balance transparency, control, and the ability to survive the unexpected – on a global scale. Join this session to learn how you can meet current and emerging digital sovereignty goals. 

    16. Compliance with confidence: Meeting regulatory mandates using software-defined community clouds

    Featured Speakers:
    Bryce Buffaloe, Product Manager Security & Compliance, Google Cloud
    Jamal Mahboob, Customer Engineer, Google Cloud

    Adopting the cloud in regulated industries can come with data residency constraints and the need for specific security controls and support. Learn how Google Cloud can help provide these assurances without the strict physical infrastructure constraints of legacy approaches, enabling organizations to benefit from cloud innovation while meeting their compliance needs.

    17. Demystifying cyber security analytics - Scalable approaches for the real world

    Featured Speakers:
    Philip Bice, Global Lead - Service Provider Partnerships, Google Cloud
    Chris Knackstedt, Sr. Manager / Data Scientist, Deloitte & Touche LLP

    In this session, join security leaders from Deloitte & Touche LLP and Google Cloud for an insightful conversation on the key trends and challenges that warrant scalable, flexible, and predictive security analytics for today’s hybrid, multi-cloud technology environments. The speakers will share practical approaches to designing and deploying use case-driven security analytics by leveraging the power of Google Cloud native data management and analytics services. The session will also cover solutions and managed services offered jointly by Deloitte and Google Cloud that can help organizations maintain their competitive differentiation and continually accelerate cyber security maturity.

    18. Best practices for defining and enforcing policies across your Google Cloud environment

    Featured Speakers:
    Vandhana Ramadurai, Sr. Product Manager, Google Cloud
    Sri Subramanian, Head of Product, Cloud Identity and Access Management, Google Cloud

    Learn how to take a policy-driven approach to governing your cloud resources. In this session, we’ll cover best practices that enable organizations to shift from remediating resources that violate requirements to a more proactive state for preventing those violations.

    19. A comprehensive strategy for managing sensitive data in the cloud

    Featured Speakers:
    Nelly Porter, Group Product Manager, Google Cloud
    Matt Presson, Lead Security Architect, Product Security, Bullish

    Data is a big asset and a big risk, and classifying and protecting it is an important task for organizations. In this session, learn how you can leverage Google security tools to more effortlessly take back control of your data.

    In addition to these sessions, there will be on-demand videos and demos published on May 17 that you can watch at your convenience by visiting the Security Summit page. We can’t wait for you to join us and learn all things Security at Google Cloud Security Summit!

    Related Article

    Cloud CISO Perspectives: April 2022

    Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team.

    Read Article
  • Building connection in the era of hybrid work Thu, 12 May 2022 16:00:00 -0000

    In response to the pandemic, an overwhelming 92% of the U.S. workforce is now interested in working in a hybrid or fully remote capacity1. All employers, including government leaders, are now critically focused on finding the right tools and techniques to foster engagement, improve productivity, and strengthen relationships – in-person and virtually – or risk losing top talent.

    Creating a positive work environment starts with having the right technology that can provide a variety of meeting tools to make work easier and more productive from any location. Google Workspace enables agency leaders to support their workforce with capabilities to address a variety of needs, from virtual training and learning, to mobilizing emergency response and critical service delivery. Below we’ve highlighted some of our top tips for getting the most out of Google Workspace to make hybrid work more efficient and effective. 

    Capture meeting agendas in a Google Doc

    Create more effective, engaging meetings by attaching an agenda doc to your Google Calendar event and inviting your attendees to comment and review ahead of time. Google Docs, Google Slides, and Google Sheets allow collaborators to leave color-coded comments that can be incorporated into a live video meeting or a face-to-face roundtable meeting.


    Engage attendees with interactive features in Google Meet

    We’ve added new features to make meetings more interactive and inclusive throughout 2022. In-meeting reactions, livestream Q&A and polls, direct streaming of meetings to YouTube, and client-side encryption are all coming to Google Meet soon. We’re also incorporating Meet directly into Docs, Sheets, and Slides to make it easier to coordinate on projects.

    Knowing how to include everyone in a meeting, even while some folks are remote and some are in the office, will be critical to bringing cohesion to the work experience. Government leaders can foster a culture that helps workers be more productive, satisfied, and perhaps even happier with their work lives.

    Add focus time to your calendar and see how time is spent

    Google Workspace has also introduced product features to promote wellness. These include focus time on Google Calendar, out-of-office and do-not-disturb statuses on Google Chat, and backgrounds and noise cancellation on Google Meet that help individuals separate home from work. Allowing employees to set boundaries or notify others of working times can help combat work fatigue. Time Insights in Calendar provides individuals with personalized analytics to see how time is spent across meetings and collaborators throughout a week. A secure cloud environment further brings these tools together to create systems of collaboration.

    [Screenshot: Calendar week view with Time Insights]

    Creating positive outcomes with a culture focused on integrated work environments

    The shift to hybrid work has already started leading to more meaningful engagement among all employees. Video calls have connected the new, tech-savvy generation of workers with more experienced workers. In a FedScoop Radio podcast on embracing hybrid-work strategies and tools, Gary Danoff, Global Google Workspace Strategic Alliance Leader, noted that multigenerational teams with a 25-year age gap between team members are 75% more likely to have positive outcomes than teams with a smaller age span or none. As we return to the office, let’s engage with and support this new culture of cross-cultural, cross-generational, and cross-geographic collaboration created by the challenges we all adapted to during the pandemic.

    Google Workspace supports a shift toward future-ready workplace cultures while providing the empathy and confidence needed for returning employees. Danoff also explains that to create a "future-proof" culture, you need to shift your mindset towards building a workplace that is future-ready. Using modern technology to foster collaboration in an evolving work environment helps organizations transform how work is done. 

    Collaborative technology facilitates productivity and constituent engagement

    Agencies around the country are embracing the new era of hybrid work. The State of Arizona migrated 36,000 employees to Google Workspace with little disruption to the way people did their jobs. Now, Arizona is planning to expand their platform to support even more departments that require highly regulated data. 

    To help constituents get back to work, states like Rhode Island and Wisconsin have created Virtual Career Centers that leverage Google Cloud solutions, which includes Google Workspace, where constituents can schedule meetings and video conferences with career coaches, develop and collaborate on resumes using cloud-based document storage, communicate directly with job recruiters, attend virtual job fairs, and apply to open positions.

    For more ideas on how to bring a collaborative, innovative culture into the workplace, catch up on content from the Google Workspace Summit.

    1 Source:  https://www.pwc.com/us/en/library/covid-19/us-remote-work-survey.html#content-free-1-0f39

  • Our I/O 2022 announcements: In demo form Thu, 12 May 2022 15:30:00 -0000

    In the Cloud PA Keynote at I/O, Aparna Sinha walked through the backend for an application that connects volunteers with volunteer opportunities in their area. In this blog post, we'll walk through each component of that application in more detail, explaining the new products Google Cloud has released, the pros and cons of the architecture we chose, and other nerdy technical details we didn't have time for in the talk.

    But first, some architecture diagrams. The application we discussed in the keynote helps connect volunteers with opportunities to help. In the keynote we highlighted two features of the backend for this application: the comment processor and the geographical volunteer-to-opportunity matching functionality. 

    The text processing feature takes free form feedback from users and uses ML and data analytics tools to route the feedback to the team that can best address that feedback. Here's the architecture diagram for that backend. 

    [Architecture diagram: the text feedback processing backend]

    The "opportunities near me" feature allows us to help users find volunteer opportunities near a given location. Here's the architecture diagram for that feature. 

    [Architecture diagram: the "opportunities near me" feature]

    Text Feedback Processing 

    Let's start by diving into the text processing pipeline. 

    The text feedback processing engine runs on a machine learning model, specifically a text classifier (a task from the natural language processing domain). As in many machine learning scenarios, the first step was to collect user feedback and synthesize a dataset pairing each piece of feedback with a label assigning it to a category; here we used "feedback", "billing_issue", and "bug" as the possible categories. By the end of this dataset creation step, the dataset structure looked like:

    user review | category

    <...>

    Too much spam. Stuff that I don't care for pops up on my screen all the time | feedback

    It works okay But I did not consent to subscribing at $28/year subscription | billing_issue

    I have bought it yet it displays ERROR IN VERIFYING MY ACCOUNT | bug

    <...>

    With this dataset ready, we imported it into Vertex AI datasets; for details on how to create a text dataset on Vertex AI, take a look at this guide. The imported dataset could then be viewed in Vertex AI datasets, including the available feedback categories and the number of samples for each category:

    [Screenshot: the imported dataset in Vertex AI, showing feedback categories and sample counts]

    Once the dataset was ready, the next step was to create the text classification model using Google AutoML. AutoML allows us to train a model with no code, in just a few simple steps that can be started directly from the Vertex AI dataset page.

    [Screenshot: the new model training dialog]

    We followed AutoML's default suggestions, including using the default values for how to split the dataset: 80% for training, 10% for validation, and 10% for testing. AutoML did all the model training and optimization automatically and notified us by email when the training was complete.

    [Screenshot: AutoML training options]
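    A rough sketch of the equivalent training call in the Vertex AI SDK, using the same 80/10/10 split (the display name is hypothetical, and dataset is the object from the import step above):

    from google.cloud import aiplatform

    # AutoML text classification training job; training runs fully managed
    job = aiplatform.AutoMLTextTrainingJob(
        display_name="feedback-classifier",
        prediction_type="classification",
        multi_label=False,
    )

    # Matches the default 80/10/10 dataset split from the console flow
    model = job.run(
        dataset=dataset,
        training_fraction_split=0.8,
        validation_fraction_split=0.1,
        test_fraction_split=0.1,
    )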

    When training was complete, we double checked the model in the Vertex AI console to make sure everything looked good. 

    [Screenshot: model evaluation in the Vertex AI console]

    To enable other members of our team to use this model, we deployed it as a Vertex AI endpoint. The endpoint exposes the model via a REST API, which made it simple to use for the members of our team who aren't AI/ML experts.

    Once deployed, the model is ready to use by following the directions in Get online predictions from AutoML models.
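    As a rough illustration (not the exact commands from the demo), deploying and calling the model with the Vertex AI SDK might look like the following; the display name and sample text are hypothetical:

    # Deploy the trained AutoML model to a new endpoint
    endpoint = model.deploy(
        deployed_model_display_name="feedback-classifier-v1",
    )

    # The endpoint now serves online predictions; calling it via the SDK:
    prediction = endpoint.predict(instances=[{"content": "The app crashes on login"}])
    print(prediction.predictions)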

    Once we had our model, we could hook up the entire pipeline. Text feedback is stored in the Firebase Realtime Database. To do advanced analytics on this data, we wanted to move it to BigQuery. Luckily, Firebase provides an easy, code-free way to do that: the Stream Collections to BigQuery extension. Once we had that installed, we were able to see the text feedback data in BigQuery in real time.

    We wanted to classify this data directly from BigQuery. To do this, we built a Cloud Function to call the Vertex AI endpoint we had just created and used BigQuery’s remote function feature. This Vertex AI endpoint contains the deployed model we previously trained to classify user feedback using AutoML Natural Language Processing.

    We deployed the Cloud Function and then created a remote UDF definition on BigQuery, allowing us to call the Cloud Function from BigQuery without having to move the data out of BigQuery or use additional tools. The results were then sent back to BigQuery, where they were displayed in the query results with the feedback data categorized.

    from google.cloud import aiplatform
    from google.cloud.aiplatform.gapic.schema import predict
    from google.protobuf import json_format
    from google.protobuf.struct_pb2 import Value

    def predict_classification(calls):
        # Vertex AI endpoint details (client_options, project, location, and
        # endpoint_id are configured elsewhere in the function's environment)
        client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)
        endpoint = client.endpoint_path(
            project=project, location=location, endpoint=endpoint_id
        )
        # Call the endpoint for each row BigQuery passes in
        for call in calls:
            content = call[0]
            instance = predict.instance.TextClassificationPredictionInstance(
                content=content,
            ).to_value()
            instances = [instance]
            parameters_dict = {}
            parameters = json_format.ParseDict(parameters_dict, Value())
            response = client.predict(
                endpoint=endpoint, instances=instances, parameters=parameters
            )

    Once the feedback data is categorized using our ML model, we can route the feedback to the correct people. We used Cloud Run Jobs for this, since it is designed for background tasks like this one. Here's the code for a job that reads from BigQuery and creates a GitHub issue for each piece of feedback labeled "bug report".

    import requests
    from google.cloud import bigquery

    def create_issue(body, timestamp):
        # File a GitHub issue for one piece of feedback
        title = f"User Report: {body}"
        response = requests.post(
            f"https://api.github.com/repos/{GITHUB_REPO}/issues",
            json={"title": title,
                  "body": f"Report Text: {body} \n Timestamp: {timestamp}",
                  "labels": ["Mobile Bug Report", "bug"]},
            headers={
                "Authorization": f"token {GITHUB_TOKEN}",
                "Accept": "application/vnd.github.v3+json"
            }
        )
        response.raise_for_status()

    bq = bigquery.client.Client()
    table = bq.get_table(TABLE_NAME)

    # Pull every piece of feedback the model categorized as a bug report
    sql = """SELECT timestamp, raw_text
    FROM `io-2022-keynote-demo.mobile_feedback.tagged_feedback`
    WHERE category="bug report"
    """
    query = bq.query(sql)

    for row in query.result():
        issue_body = row.get("raw_text")
        issue_timestamp = row.get("timestamp")
        create_issue(issue_body, issue_timestamp)

    To handle secrets like our GitHub token, we used Secret Manager and then loaded the secrets into variables with code like this:

    from google.cloud import secretmanager

    SECRET_NAME = "github-token"
    SECRET_ID = f"projects/{PROJECT_NUMBER}/secrets/{SECRET_NAME}/versions/2"
    GITHUB_TOKEN = secretmanager.SecretManagerServiceClient().access_secret_version(name=SECRET_ID).payload.data.decode()

    Hooking up to CRM or a support ticket database is similar and lets us channel any support requests or pricing issues to the customer success team. We can schedule the jobs to run when we want and as often as we want using Cloud Scheduler. Since we didn't want to constantly create new bugs, we've set the job creating GitHub issues to run once a day using this configuration in cron notation: "0 1 * * *".
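    Here's a sketch of creating that schedule with the Cloud Scheduler client library; the project, region, and job name are hypothetical, and the real target would be the Cloud Run Jobs execution endpoint (elided here):

    from google.cloud import scheduler_v1

    client = scheduler_v1.CloudSchedulerClient()
    parent = "projects/io-2022-keynote-demo/locations/us-central1"

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/file-bug-reports",
        schedule="0 1 * * *",  # once a day at 01:00
        http_target=scheduler_v1.HttpTarget(
            uri="https://...",  # Cloud Run Jobs execution endpoint (elided)
            http_method=scheduler_v1.HttpMethod.POST,
        ),
    )
    client.create_job(parent=parent, job=job)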

    Opportunities Near A Location  

    The second feature we showed in the Cloud Keynote allows users to see opportunities near a specific location. To do this we utilized the GIS features built into Postgres, so we used Cloud SQL for PostgreSQL. To query the Postgres database we used a Cloud Run service that our mobile app called as needed.
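    To give a flavor of the kind of query involved, here's a hypothetical sketch using PostGIS's ST_DWithin; the table, columns, coordinates, and connection details are illustrative, not from the demo:

    import psycopg2

    conn = psycopg2.connect(host="127.0.0.1", dbname="volunteer", user="app", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT id, name
            FROM opportunities
            WHERE ST_DWithin(
                location,                          -- geography(Point) column
                ST_MakePoint(%s, %s)::geography,   -- user's longitude, latitude
                %s                                 -- radius in meters
            )
            ORDER BY ST_Distance(location, ST_MakePoint(%s, %s)::geography)
            LIMIT 20
            """,
            (-122.08, 37.42, 5000, -122.08, 37.42),
        )
        for opportunity_id, name in cur.fetchall():
            print(opportunity_id, name)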

    At a certain point we outgrew the PostgreSQL on Cloud SQL solution, as it was too slow. We tried limiting the number of responses we returned, but that wasn't a great user experience. We needed something that was able to handle a large amount of GIS data in near real time. 

    AlloyDB excels in situations like this, where you need high throughput and real-time performance on large amounts of data. Luckily, since AlloyDB is Postgres-compatible, it's a drop-in replacement in our Cloud Run service; we simply needed to migrate the data from Cloud SQL and change the connection string our Cloud Run service was using.

    Conclusion 

    So that's a deeper dive into one of our I/O demos and the products Google Cloud launched at Google I/O this year. Please come visit us in I/O Adventure and check out the codelabs and technical sessions at https://io.google/2022/.

  • What’s new with Google Cloud Wed, 11 May 2022 21:00:00 -0000

    Want to know the latest from Google Cloud? Find it here in one handy location. Check back regularly for our newest updates, announcements, resources, events, learning opportunities, and more. 


    Tip: Not sure where to find what you’re looking for on the Google Cloud blog? Start here: Google Cloud blog 101: Full list of topics, links, and resources.


    Week of May 9 - May 13, 2022

    • We just published a blog post announcing Google Cloud’s latest STAC-M3™ benchmark results. Following up on our 2018 STAC-M3 benchmark audit, a redesigned Google Cloud architecture achieved significant improvements: up to 18x faster performance, up to 9x higher throughput, and a new record in STAC-M3.β1.1T.YRHIBID-2.TIME. We also published a whitepaper on how we designed and optimized the cluster for API-driven cloud resources.
    • Security Command Center (SCC) released new finding types that alert customers when SCC is either misconfigured or configured in a way that prevents it from operating as expected. These findings provide remediation steps to return SCC to an operational state. Learn more and see examples.

    Week of May 2 - May 6, 2022

    • As part of Anthos release 1.11, Anthos Clusters on Azure and Anthos Clusters on AWS now support Kubernetes versions 1.22.8-gke.200 and 1.21.11-gke.100. As a preview feature, you can now choose Windows as your node pool image type when you create node pools with Kubernetes version 1.22.8. For more information, check out the Anthos multi cloud website.
    • The Google Cloud Future of Data whitepaper explores why the future of data will involve three key themes: unified, flexible, and accessible.
    • Learn about BigQuery BI Engine and how to analyze large and complex datasets interactively with sub-second query response time and high concurrency. Now generally available.
    • Announcing the launch of the second series of the Google Cloud Technical Guides for Startups, a video series for technical enablement aimed at helping startups to start, build and grow their businesses.
    • Solving for food waste with data analytics in Google Cloud. Explore why it is so necessary as a retailer to bring your data to the cloud to apply analytics to minimize food waste.
    • Mosquitoes get the swat with new Mosquito Forecast built by OFF! Insect Repellents and Google Cloud. Read how SC Johnson built an app that predicts mosquito outbreaks in your area.

    Week of April 25 - April 29, 2022

    Week of April 18 - April 22, 2022 

    Week of April 11 - April 15, 2022 

    • Machine learning company Moloco uses Cloud Bigtable to process 5+ million ad bid requests per second. Learn how Moloco uses Bigtable to keep up in a speedy market and process ad requests at unmatched speed and scale.
    • The Broad Institute of MIT and Harvard speeds scientific research with Cloud SQL. One of our customers, the Broad Institute, shares how they used Cloud SQL to accelerate scientific research. In this customer story, you will learn how the Broad Institute was able to get Google’s database services up and running quickly and lower their operational burden by using Cloud SQL.
    • Data Cloud Summit ‘22 recap blog on April 12: Didn’t get a chance to watch the Google Data Cloud Summit this year? Check out our recap to learn the top five takeaways, plus product announcements, customer speakers, partners, and product demos, and find more resources on your favorite topics.
    • The new Professional Cloud Database Engineer certification in beta is here. By participating in this beta, you will directly influence and enhance the learning and career path for Cloud Database Engineers globally. Learn more and sign up today.
    • Learn how to use Kubernetes Jobs and cost-optimized Spot VMs to run and manage fault-tolerant AI/ML batch workloads on Google Kubernetes Engine.
    • Expanding Eventarc presence to 4 new regions—asia-south2, australia-southeast2, northamerica-northeast2, southamerica-west1. You can now create Eventarc resources in 30 regions.

    Week of April 4 - April 8, 2022 

    • Join us at the Google Data Cloud Summit on Wednesday, April 6, at 9 AM PDT.  Learn how Google Cloud technologies across AI, machine learning, analytics, and databases have helped organizations such as Exabeam, Deutsche Bank, and PayPal to break down silos, increase agility, derive more value from data, and innovate faster. Register today for this no cost digital event.
    • Announcing the first Data Partner Spotlight, on May 11th 
      We saved you a seat at the table to learn about the Data Cloud Partners in the Google Cloud ecosystem. We will spotlight technology partners and deep dive into their solutions, so business leaders can make smarter decisions and solve complex data challenges with Google Cloud. Register today for this digital event.
    • Introducing Vertex AI Model Registry, a central repository to manage and govern the lifecycle of your ML models. Designed to work with any type of model and deployment target, including BigQuery ML, Vertex AI Model Registry makes it easy to manage and deploy models. Learn more about Google’s unified data and AI offering.
    • Vertex AI Workbench is now GA, bringing together Google Cloud’s data and ML systems into a single interface so that teams have a common toolset across data analytics, data science, and machine learning. With native integrations across BigQuery, Spark, Dataproc, and Dataplex, data scientists can build, train, and deploy ML models 5X faster than with traditional notebooks. Don’t miss this ‘How to’ session from the Data Cloud Summit.

    Week of Mar 28 - April 1, 2022

    • Learn how Google Cloud’s network and Network Connectivity Center can transform the private wires used for voice trading.
    • Anthos bare metal 1.11 minor release is available now. Containerd is the default runtime in Anthos clusters on bare metal in this release. Feature enhancements include:
        • Upgraded Anthos clusters on bare metal to use Kubernetes version 1.22

        • Added Egress Network Address Translation (NAT) gateway capability to provide persistent, deterministic routing for egress traffic from clusters

        • Enabled IPv4/IPv6 dual-stack support

        • Additional enhancements can be found in the release notes here

    Week of Mar 21 - Mar 25, 2022

    • Google Cloud’s Behnaz Kibria reflects on a recent fireside chat that she moderated with Google Cloud’s Phil Moyer and former SEC Commissioner, Troy Paredes at FIA Boca. The discussion focused on the future of markets and policy, the new technologies that are already paving the way for greater speed and transparency, and what it will take to ensure greater resiliency, performance and security over the longer term. Read the blog.
    • Eventarc adds support for Firebase Alerts. Now you can create Eventarc triggers to send Firebase Alerts events to your favorite destinations that Eventarc supports.
    • Now you can control how your alerts handle missing data from telemetry data streams using Alert Policies in the Cloud Console or via the API. In cloud ecosystems there are millions of data sources, and often there are pauses or breaks in their telemetry data streams. Configure how this missing data influences your open incidents (see the sketch after the options below):

      • Option 1: Missing data is treated as “above the threshold,” and your incidents will stay open.

      • Option 2: Missing data is evaluated as “below the threshold,” and the incident will close after your retest window period.
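      As an illustration, here's a hedged sketch of setting this behavior on a threshold condition with the google-cloud-monitoring Python library; the metric filter, threshold, and names are hypothetical:

      from google.cloud import monitoring_v3
      from google.protobuf import duration_pb2

      condition = monitoring_v3.AlertPolicy.Condition(
          display_name="request latency above threshold",
          condition_threshold=monitoring_v3.AlertPolicy.Condition.MetricThreshold(
              filter='metric.type = "custom.googleapis.com/latency"',
              comparison=monitoring_v3.ComparisonType.COMPARISON_GT,
              threshold_value=500.0,
              duration=duration_pb2.Duration(seconds=300),
              # Option 2 above: treat missing data as below the threshold,
              # so the incident closes after the retest window
              evaluation_missing_data=monitoring_v3.AlertPolicy.Condition.EvaluationMissingData.EVALUATION_MISSING_DATA_INACTIVE,
          ),
      )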

    Week of Mar 14 - Mar 18, 2022

    • Natural language processing is a critical AI tool for understanding unstructured, often technical healthcare information, like clinical notes and lab reports. See how leading healthcare organizations are exploring NLP to unlock hidden value in their data.
    • A handheld lab: Read how Cue Health is revolutionizing healthcare diagnostics for COVID-19 and beyond—all from the comfort of home.
    • Providing reliable technical support for an increasingly distributed, hybrid workforce is becoming all the more crucial, and challenging. Cloud Customer Care has added a range of new offerings and features for businesses of all sizes to help you find the Google Cloud technical support services that are best for your needs and budget.
    • #GoogleforGames Dev Summit is NOW LIVE. Watch the keynote followed by over 20 product sessions on-demand to help you build high quality games and reach audiences around the world. Watch → g.co/gamedevsummit
    • Meeting (and ideally, exceeding) consumer expectations today is often a heavy lift for many companies—especially those running modern apps on legacy, on-premises databases. Read how Google Cloud database services provide you the best options for industry-leading reliability, global scale & open standards, enabling you to make your next big idea a reality. Read this blog.

    Week of Mar 07 - Mar 11, 2022

    • Learn how Google Cloud Partner Advantage partners help customers solve real-world business challenges in retail and ecommerce through data insights.
    • Introducing Community Security Analytics, an open-source repository of queries for self-service security analytics. Get started analyzing your own Google Cloud logs with BigQuery or Chronicle to detect potential threats to your workloads, and to audit usage of your data. Learn more.
    • On a mission to accelerate the world's adoption of a modern approach to threat management through Autonomic Security Operations, our latest update expands our ASO technology stack with Siemplify, offers a solution for the White House Executive Order 14028, introduces a community-based security analytics repository, and announces key R&D initiatives we’re investing in to bolster threat-informed defenses worldwide. Read more here.
    • Account defender, available today in public preview, is a feature in reCAPTCHA Enterprise that takes behavioral detection a step further. It analyzes the patterns of behavior for an individual account, in addition to the patterns of behavior of all user accounts associated with your website. Read more here.
    • Maximize your Cloud Spanner savings with new committed use discounts. Get up to a 40% discount on Spanner compute capacity by purchasing committed use discounts. Once you make a commitment to spend a certain amount on an hourly basis on Spanner from a billing account, you can get discounts on instances in different instance configurations, regions, and projects associated with that billing account. This flexibility helps you achieve a high utilization rate of your commitment across regions and projects without manual intervention, saving you time and money. Learn more.
    • In many places across the globe, March is celebrated as Women’s History Month, and March 8th, specifically, marks the day known around the world as International Women’s Day. Google Cloud, in partnership with Women Techmakers, has created an opportunity to bridge the gaps in the credentialing space by offering a certification journey for Ambassadors of the Women Techmakers community. Learn more.
    • Learn how to accelerate vendor due diligence on Google Cloud by leveraging third party risk management providers.
    • Hybrid work should not derail DEI efforts. If you’re moving to a hybrid work model, here’s how to make diversity, equity and inclusion central to it.
    • Learn how Cloud Data Fusion provides scalable data integration pipelines to help consolidate a customer’s SAP and non-SAP datasets within BigQuery.
    • Hong Kong–based startup TecPal builds and manages smart hardware and software for household appliances all over the world using Google Cloud. Find out how.
    • Eventarc adds support for Firebase Remote Config and Test Lab in preview. Now you can create Eventarc triggers to send Firebase Remote Config or Firebase Test Lab events to your favorite destinations that Eventarc supports. 
    • Anthos Service Mesh Dashboard is now available (public preview) on Anthos clusters on Bare Metal and Anthos clusters on VMware. Customers can now get out-of-the-box telemetry dashboards that provide a services-first view of their application in the Cloud Console.
    • Micro Focus Enterprise Server Google Cloud blueprint performs an automated deployment of Enterprise Server inside a new VPC or existing VPC. Learn more.
    • Learn how to wire your application logs with more information without adding a single line of code and get more insights with the new version of the Java library.
    • Pacemaker Alerts in Google Cloud cluster alerting enable the system administrator to be notified about critical events in enterprise workloads on GCP, such as SAP solutions.

    Week of Feb 28 - Mar 04, 2022

    • Announcing the Data Cloud Summit, April 6th!—Ready to dive deep into data? Join us at the Google Data Cloud Summit on Wednesday, April 6, at 9 AM PDT. This three-hour digital event is packed with content and experiences designed to help you unlock innovation in your organization. Learn how Google Cloud technologies across AI, machine learning, analytics, and databases have helped organizations such as Exabeam, Deutsche Bank, and PayPal to break down silos, increase agility, derive more value from data, and innovate faster. Register today for this no cost digital event.
    • Google Cloud addresses concerns about how its customers might be impacted by the invasion of Ukraine. Read more.
    • Eventarc is now HIPAA compliant— Eventarc is covered under the Google Cloud Business Associate Agreement (BAA), meaning it has achieved HIPAA compliance. Healthcare and life sciences organizations can now use Eventarc to send events that require HIPAA compliance.
    • Eventarc trigger for Workflows is now available in Preview. You can now select Workflows as a destination for events originating from any supported event provider.
    • Error Reporting automatically captures exceptions found in logs ingested by Cloud Logging from the following languages: Go, Java, Node.js, PHP, Python, Ruby, and .NET, aggregates them, and then notifies you of their existence.
    • Learn more about how USAA partnered with Google Cloud to transform their operations by leveraging AI to drive efficiency in vehicle insurance claims estimation.
    • Learn how Google Cloud and NetApp’s ability to “burst to cloud,” seamlessly spinning up compute and storage on demand, accelerates EDA design testing.
    • Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team.
    • The results are in for the Google Cloud Easy as Pie Hackathon.
    • VPC Flow Logs Org Policy Constraints allow users to enforce VPC Flow Logs enablement across their organization, and impose minimum and maximum sampling rates. VPC Flow Logs are used to understand network traffic for troubleshooting, optimization and compliance purposes.
    • Google Cloud Managed Service for Prometheus is now generally available. Get all of the benefits of open source-compatible monitoring with the ease of use of Google-scale managed services. 
    • Google Cloud Deploy now supports Anthos clusters bringing opinionated, fully managed continuous delivery for hybrid and multicloud workloads. Cloud Deploy provides integrated best practices, security, and metrics from a centralized control plane.
    • Learn Google Workspace’s vision for frontline workers and how our Frontline solution innovations can bridge collaboration and productivity across workforce in-office and remote.

    Week of Feb 21 - Feb 25, 2022

    • Read how Paerpay promotes bigger tabs and faster, more pleasant transactions with Google Cloud  and the Google for Startups Cloud Program.
    • Learn about the advancements we’ve released for our Google Cloud Marketplace customers and partners in the last few months.
    • BBVA collaborated with Google Cloud to create one of the most successful Google Cloud training programs for employees to date. Read how they did it.
    • Google for Games Developer Summit returns March 15 at 9AM PT! Learn about our latest games solutions and product innovations. It’s online and open to all. Check out the full agenda g.co/gamedevsummit 
    • Build a data mesh on Google Cloud with Dataplex (now GA 🎉). Read how Dataplex enables customers to centrally manage, monitor, and govern distributed data, and makes it securely accessible to a variety of analytics and data science tools.
    • While understanding what is happening now has great business value, forward-thinking companies like Tyson Foods are taking things a step further, using real-time analytics integrated with artificial intelligence (AI) and business intelligence (BI) to answer the question, “what might happen in the future?”
    • Join us for the first Google Cloud Security Talks of 2022, happening on March 9th. Modernizing SecOps is a top priority for so many organizations. Register to attend and learn how you can enhance your approach to threat detection, investigation and response!
    • Google Cloud introduces their Data Hero series with a profile on Lynn Langit, a data cloud architect, educator, and developer on GCP.
    • Building ML solutions? Check out these guidelines for ensuring quality in each process of the MLOps lifecycle.
    • Eventarc is now Payment Card Industry Data Security Standard (PCI DSS)-compliant.

    Week of Feb 14 - Feb 18, 2022

    • The Google Cloud Retail Digital Pulse - Asia Pacific is an ongoing annual assessment carried out in partnership with IDC Retail Insights to understand the maturity of retail digital transformation in the Asia Pacific region. The study covers 1,304 retailers across eight markets and sub-segments to investigate their digital maturity across five dimensions (strategy, people, data, technology, and process), arriving at a four-stage Digital Pulse Index, with 4 being the most mature. It provides great insights into the various stages of digital maturity of Asian retailers, their drivers for digitization, their challenges, innovation hotspots, and their focus areas with respect to use cases and technologies.
    • Deploying Cloud Memorystore for Redis at any scale: Learn how you can scale Cloud Memorystore for high-volume use cases by leveraging client-side sharding. This blog provides a step-by-step walkthrough that demonstrates how you can adapt your existing application to scale to the highest levels with the help of the Envoy Proxy. Read our blog to learn more.
    • Check out how six SAP customers are driving value with BigQuery.
    • This Black History Month, we're highlighting Black-led startups using Google Cloud to grow their businesses. Check out how DOSS and its co-founder, Bobby Bryant, disrupts the real estate industry with voice search tech and analytics on Google Cloud.
    • Vimeo leverages managed database services from Google Cloud to serve up billions of views around the world each day. Read how it uses Cloud Spanner to deliver a consistent and reliable experience to its users no matter where they are.
    • How can serverless best be leveraged? Can cloud credits be maximized? Are all managed services equal? We dive into top questions for startups.
    • Google introduces a Sustainability value pillar in the GCP Active Assist solution to accelerate our industry leadership in CO2 reduction and environmental protection efforts. An intelligent carbon footprint reduction tool has launched in preview.
    • Central States health insurance CIO Pat Moroney shares highs and lows from his career transforming IT. Read more
    • Traffic Director client authorization for proxyless gRPC services is now generally available. Combine with managed mTLS credentials in GKE to centrally manage access between workloads using Traffic Director. Read more.
    • Cloud Functions (2nd gen) is now in public preview. The next generation of our Cloud Functions Functions-as-a-Service platform gives you more features, control, performance, scalability and events sources. Learn more.

    Week of Feb 7 - Feb 11, 2022

    • Now announcing the general availability of the newest instance series in our Compute Optimized family, C2D—powered by 3rd Gen AMD EPYC processors. Read how C2D provides larger instance types, and memory per core configurations ideal for customers with performance-intensive workloads.
    • Digital health startup expands its impact on healthcare equity and diversity with Google Cloud Platform and the Google for Startups Accelerator for Black Founders. Read more.
    • Storage Transfer Service support for agent pools is now generally available (GA). You can use agent pools to create isolated groups of agents as a source or sink entity in a transfer job. This enables you to transfer data from multiple data centers and filesystems concurrently, without creating multiple projects for a large transfer spanning multiple filesystems and data centers. This option is available via the API, Console, and gcloud transfer CLI.
    • The five trends driving healthcare and life sciences in 2022 will be powered by accessible data, AI, and partnerships.
    • Learn how COLOPL, Minna Bank and 7-Eleven Japan use Cloud Spanner to solve their scalability, performance and digital transformation challenges.

    Week of Jan 31 - Feb 4, 2022

    • Pub/Sub Lite goes regional. Pub/Sub Lite is a high-volume messaging service with ultra-low cost that now offers regional Lite topics, in addition to existing zonal Lite topics. Unlike zonal topics, which are located in a single zone, regional topics are asynchronously replicated across two zones. Multi-zone replication protects against zonal failures in the service. Read about it here.

    • Google Workspace is making it easy for employees to bring modern collaboration to work, even if their organizations are still using legacy tools. Essentials Starter is a no-cost offer designed to help people bring the apps they know and love to use in their personal lives to their work life. Learn more.

    • We’re now offering 30 days free access to role-based Google Cloud training with interactive labs and opportunities to earn skill badges to demonstrate your cloud knowledge. Learn more.

    • Security Command Center (SCC) Premium adds support for additional compliance benchmarks, including CIS Google Cloud Computing Foundations 1.2 and OWASP Top 10 2017 & 2021. Learn more about how SCC helps manage and improve your cloud security posture.

    • Storage Transfer Service now offers Preview support for transfers from self-managed object storage systems via user-managed agents. With this new feature, customers can seamlessly copy PBs of data from cloud or on-premises object storage to Google Cloud Storage. Object storage sources must be compatible with Amazon S3 APIs. For customers migrating from AWS S3 to GCS, this feature gives an option to control network routes to Google Cloud. Fill out this signup form to access this STS feature.

    Week of Jan 24-Jan 28, 2022

    • Learn how Sabre leveraged a 10-year partnership with Google Cloud to power the travel industry with innovative technology. As Sabre embarked on a cloud transformation, it sought managed database services from Google Cloud that enabled low latency and improved consistency. Sabre discovered how the strengths of both Cloud Spanner and Bigtable supported unique use cases and led to high performance solutions.

    • Storage Transfer Service now offers Preview support for moving data between two filesystems and keeping them in sync on a periodic schedule. This launch offers a managed way to migrate from a self-managed filesystem to Filestore. If you have on-premises systems generating massive amounts of data that needs to be processed in Google Cloud, you can now use Storage Transfer Service to accelerate data transfer from an on-prem filesystem to a cloud filesystem. See Transfer data between POSIX file systems for details.
    • Storage Transfer Service now offers Preview support for preserving POSIX attributes and symlinks when transferring to, from, and between POSIX filesystems. Attributes include the user ID of the owner, the group ID of the owning group, the mode or permissions, the modification time, and the size of the file. See Metadata preservation for details.
    • Bigtable Autoscaling is Generally Available (GA): Bigtable Autoscaling automatically adds or removes capacity in response to the changing demand for your applications. With autoscaling, you only pay for what you need and you can spend more time on your business instead of managing infrastructure.  Learn more.

    Week of Jan 17-Jan 21, 2022

    • Sprinklr and Google Cloud join forces to help enterprises reimagine their customer experience management strategies. Hear more from Nirav Sheth, Director of ISV/Marketplace & Partner Sales.
    • Firestore Key Visualizer is Generally Available (GA): Firestore Key Visualizer is an interactive, performance monitoring tool that helps customers observe and maximize Firestore’s performance. Learn more.
    • Like many organizations, Wayfair faced the challenge of deciding which cloud databases they should migrate to in order to modernize their business and operations. Ultimately, they chose Cloud SQL and Cloud Spanner because of the databases’ clear path for shifting workloads as well as the flexibility they both provide. Learn how Wayfair was able to migrate quickly while still being able to serve production traffic at scale.

    Week of Jan 10-Jan 14, 2022

    • Start your 2022 New Year’s resolutions by learning, at no cost, how to use Google Cloud. Read more to find out how to take advantage of these training opportunities.
    • 8 megatrends drive cloud adoption—and improve security for all. Google Cloud CISO Phil Venables explains the eight major megatrends powering cloud adoption, and why they’ll continue to make the cloud more secure than on-prem for the foreseeable future. Read more.

    Week of Jan 3-Jan 7, 2022

    • Google Transfer Appliance announces General Availability of online mode. Customers collecting data at edge locations (e.g., cameras, cars, sensors) can offload it to a Transfer Appliance and stream that data to a Cloud Storage bucket. Online mode can be toggled on to send the data to Cloud Storage over the network, or off to ship the appliance instead. Customers can monitor their online transfers for appliances from the Cloud Console.

    Week of Dec 27-Dec 31, 2021

    • The most-read blogs about Google Cloud compute, networking, storage and physical infrastructure in 2021. Read more.

    • Top Google Cloud managed container blogs of 2021.

    • Four cloud security trends that organizations and practitioners should be planning for in 2022—and what they should do about them. Read more.

    • Google Cloud announces the top data analytics stories from 2021 including the top three trends and lessons they learned from customers this year. Read more.

    • Explore Google Cloud’s Contact Center AI (CCAI) and its momentum in 2021. Read more.

    • An overview of the innovations that Google Workspace delivered in 2021 for Google Meet. Read more.

    • Google Cloud’s top artificial intelligence and machine learning posts from 2021. Read more.

    • How we’ve helped break down silos, unearth the value of data, and apply that data to solve big problems. Read more.

    • A recap of the year’s infrastructure progress, from impressive Tau VMs, to industry-leading storage capabilities, to major networking leaps. Read more.

    • Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team. Read more.

    • Google Cloud - A cloud built for developers — 2021 year in review. Read more.

    • API management continued to grow in importance in 2021, and Apigee continued to innovate capabilities for customers, new solutions, and partnerships. Read more.

    • Recapping Google’s progress in 2021 toward running on 24/7 carbon-free energy by 2030 — and decarbonizing the electricity system as a whole. Read more.

    Week of Dec 20-Dec 24, 2021

    • And that’s a wrap! After engaging in countless customer interviews, we’re sharing our top 3 lessons learned from our data customers in 2021. Learn what customer data journeys inspired our top picks and what made the cut here.
    • Cloud SQL now shows you minor version information. For more information, see our documentation.
    • Cloud SQL for MySQL now allows you to select your MySQL 8.0 minor version when creating an instance and upgrade MySQL 8.0 minor version. For more information, see our documentation.
    • Cloud SQL for MySQL now supports database auditing. Database auditing lets you track specific user actions in the database, such as table updates, read queries, user privilege grants, and others. To learn more, see MySQL database auditing.

    Week of Dec 12-Dec 17, 2021

    • A critical vulnerability in a widely used logging library, Apache’s Log4j, has become a global security incident. Security researchers around the globe warn that this could have serious repercussions. Two Google Cloud Blog posts describe how Cloud Armor and Cloud IDS both help mitigate the threat.
    • Take advantage of these ten no-cost trainings before 2022. Check them out here.
    • Deploy task queues alongside your cloud application: Cloud Tasks is now available in 23 GCP regions worldwide (see the sketch after this list). Read more.
    • Managed Anthos Service Mesh support for GKE Autopilot (Preview): GKE Autopilot with Managed ASM provides ease of use and simplified administration, allowing customers to focus on their application, not the infrastructure. Customers can now let Google handle the upgrade and lifecycle tasks for both the cluster and the service mesh. Configure Managed ASM with the asmcli experiment in a GKE Autopilot cluster.
    • Policy Troubleshooter for BeyondCorp Enterprise is now generally available! Using this feature, admins can triage access failure events and perform the necessary actions to unblock users quickly. Learn more by registering for Google Cloud Security Talks on December 15 and attending the BeyondCorp Enterprise session. The event is free to attend and sessions will be available on-demand.
    • Google Cloud Security Talks, Zero Trust Edition: This week, we hosted our final Google Cloud Security Talks event of the year, focused on all things zero trust. Google pioneered the implementation of zero trust in the enterprise over a decade ago with our BeyondCorp effort, and we continue to lead the way, applying this approach to most aspects of our operations. Check out our digital sessions on-demand to hear the latest updates on Google’s vision for a zero trust future and how you can leverage our capabilities to protect your organization in today’s challenging threat environment.
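
    For the Cloud Tasks item above, here is a minimal sketch of enqueuing an HTTP task with the Python client library; the project, region, queue, and handler URL are hypothetical placeholders.

      from google.cloud import tasks_v2

      client = tasks_v2.CloudTasksClient()
      parent = client.queue_path("my-project", "us-central1", "my-queue")

      # An HTTP task: Cloud Tasks will POST this payload to the handler URL.
      task = {
          "http_request": {
              "http_method": tasks_v2.HttpMethod.POST,
              "url": "https://example.com/handler",
              "headers": {"Content-Type": "application/json"},
              "body": b'{"job": "resize-image"}',
          }
      }

      response = client.create_task(request={"parent": parent, "task": task})
      print("Created task:", response.name)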

    Week of Dec 6-Dec 10, 2021

    • 5 key metrics to measure cloud FinOps impact in 2022 and beyond - Learn about the five key metrics for measuring the impact of cloud FinOps across your organization, and how to use them to gain insights, prioritize strategic goals, and drive enterprise-wide adoption. Learn more.
    • We announced that Cloud IDS, our new network security offering, is now generally available. Cloud IDS, built with Palo Alto Networks’ technologies, delivers easy-to-use, cloud-native, managed, network-based threat detection with industry-leading breadth and security efficacy. To learn more, and to request a 30-day trial credit, see the Cloud IDS webpage.

    Week of Nov 29-Dec 3, 2021

    • Join Cloud Learn, happening from Dec. 8-9: This interactive learning event will have live technical demos, Q&As, career development workshops, and more covering everything from Google Cloud fundamentals to certification prep. Learn more.

    • Get a deep dive into BigQuery Administrator Hub–with BigQuery Administrator Hub, administrators can better manage BigQuery at scale using Resource Charts and the Slot Estimator. Learn more about these tools and just how easy they are to use here.

    • New data and AI in Media blog - How data and AI can help media companies personalize more effectively, and what to watch out for. We interviewed Googlers Gloria Lee, Executive Account Director of Media & Entertainment, and John Abel, Technical Director for the Office of the CTO, to share exclusive insights on how media organizations should think about their data, and ways to make the most of it in the new era of direct-to-consumer. Watch our video interview with Gloria and John and read more.

    • Datastream is now generally available (GA): Datastream, a serverless change data capture (CDC) and replication service, allows you to synchronize data across heterogeneous databases, storage systems, and applications reliably and with minimal latency to support real-time analytics, database replication, and event-driven architectures. Datastream currently supports CDC ingestion from Oracle and MySQL to Cloud Storage, with additional sources and destinations coming in the future. Datastream integrates with Dataflow and Cloud Data Fusion to deliver real time replication to a wide range of destinations, including BigQuery, Cloud Spanner and Cloud SQL. Learn more.

    Week of Nov 22 - Nov 26, 2021

    • Security Command Center (SCC) launches new mute findings capability: We’re excited to announce a new “Mute Findings” capability in SCC that helps you gain operational efficiencies by effectively managing the findings volume based on your organization’s policies and requirements. SCC presents potential security risks in your cloud environment as ‘findings’ across misconfigurations, vulnerabilities, and threats. With the launch of the ‘mute findings’ capability, you gain a way to reduce findings volume and focus on the security issues that are highly relevant to you and your organization. To learn more, read this blog post and watch this short demo video.

    Week of Nov 15 - Nov 19, 2021

    • Cloud Spanner is our distributed, globally scalable SQL database service that decouples compute from storage, which makes it possible to scale processing resources separately from storage. This means that horizontal upscaling is possible with no downtime for achieving higher performance on dimensions such as operations per second for both reads and writes. The distributed scaling nature of Spanner’s architecture makes it an ideal solution for unpredictable workloads such as online games. Learn how you can get started developing global multiplayer games using Spanner (see the sketch after this list).

    • New Dataflow templates for Elasticsearch released to help customers process and export Google Cloud data into their Elastic Cloud. You can now push data from Pub/Sub, Cloud Storage, or BigQuery into your Elasticsearch deployments in a cloud-native fashion. Read more for a deep dive on how to set up a Dataflow streaming pipeline to collect and export your Cloud Audit logs into Elasticsearch and analyze them in the Kibana UI.

    • We’re excited to announce the public preview of Google Cloud Managed Service for Prometheus, a new monitoring offering designed for scale and ease of use that maintains compatibility with the open-source Prometheus ecosystem. While Prometheus works well for many basic deployments, managing Prometheus can become challenging at enterprise scale. Learn more about the service in our blog and on the website.
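
    For the Spanner gaming item above, a minimal sketch of writing and reading player state with the Python client; the instance, database, table, and column names are hypothetical.

      from google.cloud import spanner

      client = spanner.Client()
      database = client.instance("game-instance").database("game-db")

      # Atomically commit a new player row.
      with database.batch() as batch:
          batch.insert(
              table="Players",
              columns=("PlayerId", "Name", "Score"),
              values=[("player-123", "Ada", 9000)],
          )

      # Strongly consistent read of the same row.
      with database.snapshot() as snapshot:
          rows = snapshot.execute_sql(
              "SELECT Name, Score FROM Players WHERE PlayerId = @id",
              params={"id": "player-123"},
              param_types={"id": spanner.param_types.STRING},
          )
          for name, score in rows:
              print(name, score)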

    Week of Nov 8 - Nov 12, 2021

    Week of Nov 1 - Nov 5, 2021

    • Time to live (TTL) reduces storage costs, improves query performance, and simplifies data retention in Cloud Spanner by automatically removing unneeded data based on user-defined policies. Unlike custom scripts or application code, TTL is fully managed and designed for minimal impact on other workloads. TTL is generally available today in Spanner at no additional cost. Read more, or see the sketch after this list.
    • New whitepaper available: Migrating to .NET Core/5+ on Google Cloud - This free whitepaper, written for .NET developers and software architects who want to modernize their .NET Framework applications, outlines the benefits and things to consider when migrating .NET Framework apps to .NET Core/5+ running on Google Cloud. It also offers a framework with suggestions to help you build a strategy for migrating to a fully managed Kubernetes offering or to Google serverless. Download the free whitepaper.
    • Export from Google Cloud Storage: Storage Transfer Service now offers Preview support for exporting data from Cloud Storage to any POSIX file system. You can use this bidirectional data movement capability to move data in and out of Cloud Storage, on-premises clusters, and edge locations including Google Distributed Cloud. The service provides built-in capabilities such as scheduling, bandwidth management, retries, and data integrity checks that simplify the data transfer workflow. For more information, see Download data from Cloud Storage.
    • Document Translation is now GA! Translate documents in real time in 100+ languages while retaining document formatting. Learn more about the new features and see a demo of how Eli Lilly translates content globally.
    • Announcing the general availability of Cloud Asset Inventory console - We’re excited to announce the general availability of the new Cloud Asset Inventory user interface. In addition to all the capabilities announced earlier in Public Preview, the general availability release provides powerful search and easy filtering capabilities. These capabilities enable you to view details of resources and IAM policies, machine type and policy statistics, and insights into your overall cloud footprint. Learn more about these new capabilities by using the searching resources and searching IAM policies guides. You can get more information about Cloud Asset Inventory using our product documentation.
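
    For the Spanner TTL item above, a minimal sketch of adding a row deletion policy through a DDL statement with the Python client; the table, column, and 30-day window are hypothetical.

      from google.cloud import spanner

      client = spanner.Client()
      database = client.instance("my-instance").database("my-db")

      # Rows whose CreatedAt timestamp is older than 30 days become
      # eligible for background deletion.
      operation = database.update_ddl([
          "ALTER TABLE Sessions "
          "ADD ROW DELETION POLICY (OLDER_THAN(CreatedAt, INTERVAL 30 DAY))"
      ])
      operation.result()  # wait for the schema change to complete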

    Week of Oct 25 - Oct 29, 2021

    • BigQuery table snapshots are now generally available. A table snapshot is a low-cost, read-only copy of a table's data as it was at a particular time (see the sketch after this list).
    • By establishing a robust value measurement approach to track business value metrics against business goals, the discipline of Cloud FinOps brings technology, finance, and business leaders together to show how digital transformation enables the organization to create new, innovative capabilities and generate top-line revenue. Learn more.
    • We’ve announced BigQuery Omni, a new multicloud analytics service that allows data teams to perform cross-cloud analytics - across AWS, Azure, and Google Cloud - all from one viewpoint. Learn how BigQuery Omni works and what data and business challenges it solves here.
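
    For the table snapshots item above, a minimal sketch of creating a snapshot with BigQuery DDL through the Python client; the project, dataset, and table names are hypothetical.

      from google.cloud import bigquery

      client = bigquery.Client()

      # Snapshot the table as it looked one hour ago.
      client.query("""
          CREATE SNAPSHOT TABLE `my-project.my_dataset.orders_snapshot`
          CLONE `my-project.my_dataset.orders`
          FOR SYSTEM_TIME AS OF
            TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
      """).result()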

    Week of Oct 18 - Oct 22, 2021

    • Available now is our newest T2D VM family, based on 3rd Generation AMD EPYC processors. Learn more.
    • In case you missed it — top AI announcements from Google Cloud Next. Catch up on what’s new, see demos, and hear from our customers about how Google Cloud is making AI more accessible, more focused on business outcomes, and fast-tracking the time-to-value.
    • Too much to take in at Google Cloud Next 2021? No worries - here’s a breakdown of the biggest announcements at the 3-day event.
    • Check out the second revision of the Architecture Framework, Google Cloud’s collection of canonical best practices.

    Week of Oct 4 - Oct 8, 2021

    • We’re excited to announce Google Cloud’s new goal of equipping more than 40 million people with Google Cloud skills. To help achieve this goal, we’re offering no-cost access to all our training content this month. Find out more here.
    • Support for language repositories in Artifact Registry is now generally available. Artifact Registry allows you to store all your language-specific artifacts in one place. Supported package types include Java, Node and Python. Additionally, support for Linux packages is in public preview. Learn more.
    • Want to know what’s the latest with Google ML-Powered intelligence service Active Assist and how to learn more about it at Next’21? Check out this blog.

    Week of Sept 27 - Oct 1, 2021

    • Announcing the launch of Speaker ID. In 2020, customer preference for voice calls increased by 10 percentage points (to 43%) and was by far the most preferred service channel. But most callers still need to pass through archaic authentication processes that slow down time to resolution and burn through valuable agent time. Speaker ID, from Google Cloud, brings ML-based speaker identification directly to customers and contact center partners, allowing callers to authenticate over the phone using their own voice. Learn more.
    • Your guide to all things AI & ML at Google Cloud Next. Google Cloud Next is coming October 12–14, and if you’re interested in AI & ML, we’ve got you covered. Tune in to hear about real use cases from companies like Twitter, Eli Lilly, Wayfair, and more. We’ll also share product news and hands-on AI learning opportunities. Learn more about AI at Next and register for free today!
    • It is now simple to use Terraform to configure Anthos features on your GKE clusters. Check out part two of this series, which explores adding Policy Controller audits to our Config Sync-managed cluster. Learn more.

    Week of Sept 20 - Sept 24, 2021

    • Announcing the webinar, Powering market data through cloud and AI/ML. We’re sponsoring a Coalition Greenwich webinar on September 23rd where we’ll discuss the findings of our upcoming study on how market data delivery and consumption is being transformed by cloud and AI. Moderated by Coalition Greenwich, the panel will feature Trey Berre from CME Group, Brad Levy from Symphony, and Ulku Rowe representing Google Cloud. Register here.
    • New research from Google Cloud reveals five innovation trends for market data. Together with Coalition Greenwich we surveyed exchanges, trading systems, data aggregators, data producers, asset managers, hedge funds, and investment banks to examine both the distribution and consumption of market data and trading infrastructure in the cloud. Learn more about our findings here.
    • If you are looking for a more automated way to manage quotas across a large number of projects, we are excited to introduce a Quota Monitoring Solution from Google Cloud Professional Services. This solution benefits customers who have many projects or organizations and want an easy way to monitor quota usage in a single dashboard, with default alerting capabilities across all quotas.

      Week of Sept 13 - Sept 17, 2021

      • New storage features help ensure data is never lost. We are announcing extensions to our popular Cloud Storage offering and introducing two new services, Filestore Enterprise and Backup for Google Kubernetes Engine (GKE). Together, these new capabilities will make it easier for you to protect your data out of the box, across a wide variety of applications and use cases. Read the full article.
      • API management powers sustainable resource management. Water, waste, and energy solutions company Veolia uses APIs and the API management platform Apigee to build apps and to help its customers build their own apps, too. Learn from their digital and API-first approach here.
      • To support our expanding customer base in Canada, we’re excited to announce that the new Google Cloud Platform region in Toronto is now open. Toronto is the 28th Google Cloud region connected via our high-performance network, helping customers better serve their users and customers throughout the globe. In combination with Montreal, customers now benefit from improved business continuity planning with distributed, secure infrastructure needed to meet IT and business requirements for disaster recovery, while maintaining data sovereignty.
      • Cloud SQL now supports custom formatting controls for CSVs. When performing admin exports and imports, users can now select custom characters for field delimiters, quotes, escapes, and other characters. For more information, see our documentation, or see the sketch after this list.
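
      For the CSV formatting item above, a minimal sketch of an export with custom delimiter, quote, and escape characters through the Cloud SQL Admin API; the project, instance, bucket, and query are hypothetical, and the csvExportOptions field names are an assumption based on the Admin API at the time.

        from googleapiclient import discovery

        sqladmin = discovery.build("sqladmin", "v1beta4")
        body = {
            "exportContext": {
                "fileType": "CSV",
                "uri": "gs://my-bucket/orders.csv",
                "databases": ["mydb"],
                "csvExportOptions": {
                    "selectQuery": "SELECT * FROM orders",
                    "fieldsTerminatedBy": "|",   # custom field delimiter
                    "quoteCharacter": "\"",
                    "escapeCharacter": "\\",
                },
            }
        }
        sqladmin.instances().export(
            project="my-project", instance="my-instance", body=body
        ).execute()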

      Week of Sept 6 - Sept 10, 2021

      • Hear how Lowe’s SRE team reduced Mean Time to Recovery (MTTR) by over 80% after adopting Google’s Site Reliability Engineering practices and Google Cloud’s operations suite.

      Week of Aug 30 - Sept 3, 2021

      • A what’s new blog in the what’s new blog? Yes, you read that correctly. Google Cloud data engineers are always hard at work maintaining the hundreds of dataset pipelines that feed into our public datasets repository, but they’re also regularly bringing new ones into the mix. Check out our newest featured datasets and catch a few best practices in our living blog: What are the newest datasets in Google Cloud?
      • Migration success with Operational Health Reviews from Google Cloud’s Professional Service Organization - Learn how Google Cloud’s Professional Services Org is proactively and strategically guiding customers to operate effectively and efficiently in the Cloud, both during and after their migration process.
      • Learn how we simplified monitoring for Google Cloud VMware Engine with the Google Cloud operations suite. Read more.

      Week of Aug 23 - Aug 27, 2021

      • Google Transfer Appliance announces preview of online mode. Customers are increasingly collecting data that needs to be transferred to the cloud quickly. Transfer Appliances are being used to quickly offload data from sources (e.g. cameras, cars, sensors) and can now stream that data to a Cloud Storage bucket. Online mode can be toggled as data is copied into the appliance, letting you either ship the appliance to Google (offline) or copy the data to Cloud Storage over the network (online). Read more.
      • Topic retention for Cloud Pub/Sub is now Generally Available. Topic retention is the most comprehensive and flexible way available to retain Pub/Sub messages for message replay. In addition to backing up all subscriptions connected to the topic, new subscriptions can now be initialized from a timestamp in the past. Learn more about the feature here, or see the replay sketch after this list.
      • Vertex Predictions now supports private endpoints for online prediction. Through VPC Peering, Private Endpoints provide increased security and lower latency when serving ML models. Read more.
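
      For the topic retention item above, a minimal sketch of replaying retained messages by seeking a subscription back in time with the Python client; the project and subscription names are hypothetical, and topic retention must already be enabled.

        import datetime

        from google.cloud import pubsub_v1
        from google.protobuf import timestamp_pb2

        subscriber = pubsub_v1.SubscriberClient()
        subscription = subscriber.subscription_path("my-project", "my-sub")

        # Seek one hour into the past; messages retained by the topic
        # are redelivered to this subscription.
        target = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(hours=1)
        ts = timestamp_pb2.Timestamp()
        ts.FromDatetime(target)

        subscriber.seek(request={"subscription": subscription, "time": ts})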

      Week of Aug 16 - Aug 20, 2021

      • Look for us to take security one step further by adding authorization features for service-to-service communications for gRPC proxyless services, as well as support for other deployment models where proxyless gRPC services run somewhere other than GKE, for example Compute Engine. We hope you'll join us: check out the setup guide and give us feedback.
      • Cloud Run now supports VPC Service Controls. You can now protect your Cloud Run services against data exfiltration by using VPC Service Controls in conjunction with Cloud Run’s ingress and egress settings. Read more.
      • Read how retailers are leveraging Google Cloud VMware Engine to move their on-premises applications to the cloud, where they can achieve the scale, intelligence, and speed required to stay relevant and competitive. Read more.
      • A series of new features for BeyondCorp Enterprise, our zero trust offering. We now offer native support for client certificates for eight types of VPC-SC resources. We are also announcing general availability of the on-prem connector, which allows users to secure HTTP- or HTTPS-based on-premises applications outside of Google Cloud. Additionally, three new BeyondCorp attributes are available in Access Context Manager as part of a public preview. Customers can configure custom access policies based on time and date, credential strength, and/or Chrome browser attributes. Read more about these announcements here.
      • We are excited to announce that Google Cloud, working with its partners NAG and DDN, demonstrated the highest performing Lustre file system on the IO500 ranking of the fastest HPC storage systems — quite a feat considering Lustre is one of the most widely deployed HPC file systems in the world.  Read the full article.
      • The Storage Transfer Service for on-premises data API is now available in Preview. Now you can use RESTful APIs to automate your on-prem-to-cloud transfer workflows. Storage Transfer Service is a software service to transfer data over a network, with built-in capabilities such as scheduling, bandwidth management, retries, and data integrity checks that simplify the data transfer workflow.
      • It is now simple to use Terraform to configure Anthos features on your GKE clusters. This is the first part of a three-part series that describes using Terraform to enable Config Sync. For platform administrators, this natural IaC approach improves auditability and transparency and reduces the risk of misconfigurations or security gaps. Read more.
      • In this commissioned study, “Modernize With AIOps To Maximize Your Impact”, Forrester Consulting surveyed organizations worldwide to better understand how they’re approaching artificial intelligence for IT operations (AIOps) in their cloud environments, and what kind of benefits they’re seeing. Read more.
      • If your organization or development environment has strict security policies that don’t allow for external IPs, it can be difficult to set up a connection between a private Cloud SQL instance and a private IP VM. This article contains clear instructions on how to set up a connection from a private Compute Engine VM to a private Cloud SQL instance using a private service connection and the mysqlsh command line tool.

      Week of Aug 9 - Aug 13, 2021

      • Compute Engine users have a new, updated set of VM-level “in-context” metrics, charts, and logs to correlate signals for common troubleshooting scenarios across CPU, disk, memory, networking, and live processes. This brings the best of Google Cloud’s operations suite directly to the Compute Engine UI. Learn more.
      • The Pub/Sub to Splunk Dataflow template has been updated to address multiple enterprise customer asks, from improved compatibility with the Splunk Add-on for Google Cloud Platform, to more extensibility with user-defined functions (UDFs), to general pipeline reliability enhancements that tolerate failures like transient network issues when delivering data to Splunk. Read more to learn how to take advantage of these latest features.
      • Google Cloud and NVIDIA have teamed up to make VR/AR workloads easier and faster to create, and tetherless! Read more.
      • Register for the Google Cloud Startup Summit, September 9, 2021 at goo.gle/StartupSummit for a digital event filled with inspiration, learning, and discussion. This event will bring together our startup and VC community to discuss the latest trends and insights, headlined by a keynote by Astro Teller, Captain of Moonshots at X the moonshot factory. Additionally, learn from a variety of technical and business sessions to help take your startup to the next level.
      • Google Cloud and Harris Poll healthcare research reveals COVID-19 impacts on healthcare technology. Learn more.
      • Partial SSO is now available for public preview. If you use a third-party identity provider for single sign-on into Google services, Partial SSO allows you to designate a subset of your users to use Google / Cloud Identity as your SAML SSO identity provider (short video and demo).

      Week of Aug 2-Aug 6, 2021

      • Gartner named Google Cloud a Leader in the 2021 Magic Quadrant for Cloud Infrastructure and Platform Services, formerly Infrastructure as a Service. Learn more.
      • Private Service Connect is now generally available. Private Service Connect lets you create private and secure connections to Google Cloud and third-party services with service endpoints in your VPCs. Read more.
      • 30 migration guides designed to help you identify the best ways to migrate, addressing common organizational goals like minimizing time and risk during your migration, identifying the most enterprise-grade infrastructure for your workloads, picking a cloud that aligns with your organization’s sustainability goals, and more. Read more.

      Week of Jul 26-Jul 30, 2021

      • This week we’re hosting our Retail & Consumer Goods Summit, a digital event dedicated to helping leading retailers and brands digitally transform their business. Read more about our consumer packaged goods strategy and a guide to key summit content for brands in this blog from Giusy Buonfantino, Google Cloud’s Vice President of CPG.

      • See how IKEA uses Recommendations AI to provide customers with more relevant product information. Read more.

      • Google Cloud launches a career program for people with autism, designed to hire and support more talented people with autism in the rapidly growing cloud industry. Learn more.

      • Google Cloud follows new API stability tenets that work to minimize unexpected deprecations to our Enterprise APIs. Read more.

      Week of Jul 19-Jul 23, 2021

      • Register and join us for Google Cloud Next, October 12-14, 2021 at g.co/CloudNext for a fresh approach to digital transformation, as well as a few surprises. Next ‘21 will be a fully customizable digital adventure for a more personalized learning journey. Find the tools and training you need to succeed, from live, interactive Q&As and informative breakout sessions to educational demos and real-life applications of the latest tech from Google Cloud. Get ready to plug into your cloud community, get informed, and be inspired. Together we can tackle today’s greatest business challenges, and start solving for what’s next.
      • "Application Innovation" takes a front row seat this year– To stay ahead of rising customer expectations and the digital and in-person hybrid landscape, enterprises must know what application innovation means and how to deliver this type of innovation with a small piece of technology that might surprise you. Learn more about the three pillars of app innovation here.
      • We announced Cloud IDS, our new network security offering, which is now available in preview. Cloud IDS delivers easy-to-use, cloud-native, managed, network-based threat detection. With Cloud IDS, customers can enjoy a Google Cloud-integrated experience, built with Palo Alto Networks’ industry-leading threat detection technologies to provide high levels of security efficacy. Learn more.
      • Key Visualizer for Cloud Spanner is now generally available. Key Visualizer is a new interactive monitoring tool that lets developers and administrators analyze usage patterns in Spanner. It reveals trends and outliers in key performance and resource metrics for databases of any size, helping to optimize queries and reduce infrastructure costs. See it in action.
      • The market for healthcare cloud is projected to grow 43%, driving the need for better tech infrastructure, digital transformation, and cloud tools. Learn how Google Cloud Partner Advantage partners help customers solve business challenges in healthcare.

      Week of Jul 12-Jul 16, 2021

      • Simplify VM migrations with Migrate for Compute Engine as a Service: a Google-managed cloud service that enables simple, frictionless, large-scale enterprise migrations of virtual machines to Google Compute Engine with minimal downtime and risk. API-driven and integrated into your Google Cloud console for ease of use, the service uses agentless replication to copy data without manual intervention and without VPN requirements. It also lets you launch non-disruptive validations of your VMs prior to cutover. Rapidly migrate a single application, or execute a sprint with hundreds of systems using migration groups, with confidence. Read more here.
      • The Google Cloud region in Delhi NCR is now open for business, ready to host your workloads. Learn more and watch the region launch event here.
      • Introducing Quilkin: the open-source game server proxy. Developed in collaboration with Embark Studios, Quilkin is an open source UDP proxy, tailor-made for high performance real-time multiplayer games. Read more.
      • We’re making Google Glass on Meet available to a wider network of global customers. Learn more.
      • Transfer Appliance supports Google Managed Encryption Keys — We’re announcing support for Google Managed Encryption Keys with Transfer Appliance, in addition to the currently available Customer Managed Encryption Keys feature. Customers have asked for the Transfer Appliance service to create and manage encryption keys for transfer sessions to improve usability while maintaining security. The Transfer Appliance service can now manage the encryption keys for customers who do not wish to handle a key themselves. Learn more about Using Google Managed Encryption Keys.

      • UCLA builds a campus-wide API program–With Google Cloud's API management platform, Apigee, UCLA created a strong, unified API foundation that removes the data friction students, faculty, and administrators alike face. This foundation not only simplifies how various personas connect to data, but also encourages more innovation in the future. Learn their story.

      • An enhanced region picker makes it easy to choose a Google Cloud region with the lowest CO2 output. Learn more.
      • Amwell and Google Cloud explore five ways telehealth can help democratize access to healthcare. Read more.
      • Major League Baseball and Kaggle launch ML competition to learn about fan engagement. Batter up!
      • We’re rolling out general support of Brand Indicators for Message Identification (BIMI) in Gmail within Google Workspace. Learn more.

      • Learn how DeNA Sports Business created an operational status visualization system that helps determine whether live event attendees have correctly installed Japan’s coronavirus contact tracing app COCOA.

      • Google Cloud CAS provides a highly scalable and available private CA to address the unprecedented growth in certificates in the digital world. Read more about CAS.

      Week of Jul 5-Jul 9, 2021

      • Google Cloud and Call of Duty League launch ActivStat to bring fans, players, and commentators the power of competitive statistics in real-time. Read more.
      • Building applications is a heavy lift, in large part because of the technical complexity of the backend services used to manage and store data. Firestore changes this by having Google Cloud manage your backend complexity as a complete backend-as-a-service (see the sketch after this list). Learn more.
      • Google Cloud’s new Native App Development skills challenge lets you earn badges that demonstrate your ability to create cloud-native apps. Read more and sign up.
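
      For the Firestore item above, a minimal sketch of writing and reading a document with the Python client; the collection and field names are hypothetical.

        from google.cloud import firestore

        db = firestore.Client()

        # Write (or overwrite) a document keyed by user id.
        db.collection("users").document("alice").set(
            {"plan": "premium", "credits": 42}
        )

        # Read it back.
        snapshot = db.collection("users").document("alice").get()
        print(snapshot.to_dict())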

      Week of Jun 28-Jul 2, 2021

      • Storage Transfer Service now offers preview support for integration with AWS Security Token Service. Security-conscious customers can now use Storage Transfer Service to perform transfers from AWS S3 without passing any security credentials. This release alleviates the security burden associated with passing long-term AWS S3 credentials, which have to be rotated or explicitly revoked when they are no longer needed. Read more.
      • The most popular and surging Google Search terms are now available in BigQuery as a public dataset. View the Top 25 and Top 25 rising queries from Google Trends from the past 30 days, with 5 years of historical data, across the 210 Designated Market Areas (DMAs) in the US. Learn more, or try the sample query after this list.
      • A new predictive autoscaling capability lets you add additional Compute Engine VMs in anticipation of forecasted demand. Predictive autoscaling is generally available across all Google Cloud regions. Read more or consult the documentation for more information on how to configure, simulate and monitor predictive autoscaling.
      • Messages by Google is now the default messaging app for all AT&T customers using Android phones in the United States. Read more.
      • TPU v4 Pods will soon be available on Google Cloud, providing the most powerful publicly available computing platform for machine learning training. Learn more.
      • Cloud SQL for SQL Server has addressed multiple enterprise customer asks with the GA releases of both SQL Server 2019 and Active Directory integration, as well as the Preview release of Cross Region Replicas. This set of releases works in concert to allow customers to set up a more scalable and secure managed SQL Server environment for their workloads’ needs. Read more.
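
      For the Google Trends dataset item above, a sample query from the BigQuery Python client; the table and column names (top_terms, term, rank, refresh_date) are assumptions based on the dataset's published schema.

        from google.cloud import bigquery

        client = bigquery.Client()
        query = """
            SELECT term, rank
            FROM `bigquery-public-data.google_trends.top_terms`
            WHERE refresh_date = (
                SELECT MAX(refresh_date)
                FROM `bigquery-public-data.google_trends.top_terms`
            )
            GROUP BY term, rank
            ORDER BY rank
            LIMIT 25
        """
        for row in client.query(query):
            print(row.rank, row.term)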

      Week of Jun 21-Jun 25, 2021

      • Simplified return-to-office with no-code technology—We've just released a solution to your most common return-to-office headaches: make a no-code app customized to solve your business-specific challenges. Learn how to create an automated app where employees can see office room occupancy, check which desks are reserved or open, review disinfection schedules, and more in this blog tutorial.
      • New technical validation whitepaper for running ecommerce applications—An Enterprise Strategy Group analyst outlines the challenges organizations face running ecommerce applications and how Google Cloud helps mitigate those challenges and handle changing demands with global infrastructure solutions. Download the whitepaper.
      • The full agenda for the Google for Games Developer Summit on July 12th-13th, 2021 is now available. It's a free digital event with announcements from teams including Stadia, Google Ads, AdMob, Android, Google Play, Firebase, Chrome, YouTube, and Google Cloud. Hear more about how Google Cloud technology creates opportunities for gaming companies to make lasting enhancements for players and creatives. Register at g.co/gamedevsummit
      • BigQuery row-level security is now generally available, giving customers a way to control access to subsets of data in the same table for different groups of users. Row-level security (RLS) extends the principle of least-privilege access and enables fine-grained access control policies in BigQuery tables. BigQuery currently supports access controls at the project-, dataset-, table-, and column-level. Adding RLS to the portfolio of access controls now enables customers to filter and define access to specific rows in a table based on qualifying user conditions—providing much-needed peace of mind for data professionals (see the sketch after this list).
      • Transfer from Azure ADLS Gen 2: Storage Transfer Service offers Preview support for transferring data from Azure ADLS Gen 2 to Google Cloud Storage. Take advantage of a scalable, serverless service to handle data transfer. Read more.
      • reCAPTCHA V2 and V3 customers can now migrate site keys to reCAPTCHA Enterprise in under 10 minutes and without making any code changes. Watch our Webinar to learn more. 
      • Bot attacks are the biggest threat to your business that you probably haven’t addressed yet. Check out our Forbes article to see what you can do about it.
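
      For the row-level security item above, a minimal sketch of creating a row access policy with BigQuery DDL; the table, group, and filter column are hypothetical.

        from google.cloud import bigquery

        client = bigquery.Client()

        # Members of the group only see rows where region = "US".
        client.query("""
            CREATE ROW ACCESS POLICY us_sales_only
            ON `my-project.sales.orders`
            GRANT TO ("group:us-sales@example.com")
            FILTER USING (region = "US")
        """).result()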

      Week of Jun 14-Jun 18, 2021

      • A new VM family for scale-out workloads—New AMD-based Tau VMs offer 56% higher absolute performance and 42% higher price-performance compared to general-purpose VMs from any of the leading public cloud vendors. Learn more.
      • New whitepaper helps customers plot their cloud migrations—Our new whitepaper distills the conversations we’ve had with CIOs, CTOs, and their technical staff into several frameworks that can help cut through the hype and the technical complexity to help devise the strategy that empowers both the business and IT. Read more or download the whitepaper.
      • Ubuntu Pro lands on Google Cloud—The general availability of Ubuntu Pro images on Google Cloud gives customers an improved Ubuntu experience, expanded security coverage, and integration with critical Google Cloud features. Read more.
      • Navigating hybrid work with a single, connected experience in Google Workspace—New additions to Google Workspace help businesses navigate the challenges of hybrid work, such as Companion Mode for Google Meet calls. Read more.
      • Arab Bank embraces Google Cloud technology—This Middle Eastern bank now offers innovative apps and services to their customers and employees with Apigee and Anthos. In fact, Arab Bank reports over 90% of their new-to-bank customers are using their mobile apps. Learn more.
      • Google Workspace for the Public Sector events—This June, learn about Google Workspace tips and tricks to help you get things done. Join us for one or more of our learning events tailored for government and higher education users. Learn more.

      Week of Jun 7-Jun 11, 2021

      • The top cloud capabilities industry leaders want for sustained innovation—Multicloud and hybrid cloud approaches, coupled with open-source technology adoption, enable IT teams to take full advantage of the best cloud has to offer. Our recent study with IDG shows just how much of a priority this has become for business leaders. Read more or download the report.
      • Announcing the Firmina subsea cable—Planned to run from the East Coast of the United States to Las Toninas, Argentina, with additional landings in Praia Grande, Brazil, and Punta del Este, Uruguay, Firmina will be the longest open subsea cable in the world capable of running entirely from a single power source at one end of the cable if its other power source(s) become temporarily unavailable—a resilience boost at a time when reliable connectivity is more important than ever. Read more.
      • New research reveals what’s needed for AI acceleration in manufacturing—According to our data, which polled more than 1,000 senior manufacturing executives across seven countries, 76% have turned to digital enablers and disruptive technologies such as data and analytics, cloud, and artificial intelligence (AI) due to the pandemic. And 66% of manufacturers who use AI in their day-to-day operations report that their reliance on AI is increasing. Read more or download the report.
      • Cloud SQL offers even faster maintenance—Cloud SQL maintenance is zippier than ever. MySQL and PostgreSQL planned maintenance typically lasts less than 60 seconds and SQL Server maintenance typically lasts less than 120 seconds. You can learn more about maintenance here.
      • Simplifying Transfer Appliance configuration with the Cloud Setup Application—We’re announcing the availability of the Transfer Appliance Cloud Setup Application, which uses the information you provide through simple prompts to configure your Google Cloud permissions, preferred Cloud Storage bucket, and Cloud KMS key for your transfer. Several Cloud Console-based manual steps are now simplified with a command-line experience. Read more.
      • Google Cloud VMware Engine is now HIPAA compliant—As of April 1, 2021, Google Cloud VMware Engine is covered under the Google Cloud Business Associate Agreement (BAA), meaning it has achieved HIPAA compliance. Healthcare organizations can now migrate and run their HIPAA-compliant VMware workloads in a fully compatible VMware Cloud Verified stack running natively in Google Cloud with Google Cloud VMware Engine, without changes or re-architecture to tools, processes, or applications. Read more.
      • Introducing container-native Cloud DNS—Kubernetes networking almost always starts with a DNS request. DNS has broad impacts on your application and cluster performance, scalability, and resilience. That is why we are excited to announce the release of container-native Cloud DNS—the native integration of Cloud DNS with Google Kubernetes Engine (GKE) to provide in-cluster Service DNS resolution with Cloud DNS, our scalable and full-featured DNS service. Read more.
      • Welcoming the EU’s new Standard Contractual Clauses for cross-border data transfers—Learn how we’re incorporating the new Standard Contractual Clauses (SCCs) into our contracts to help protect our customers’ data and meet the requirements of European privacy legislation. Read more.
      • Lowe’s meets customer demand with Google SRE practices—Learn how Lowe’s has been able to increase the number of releases they can support by adopting Google’s Site Reliability Engineering (SRE) framework and leveraging their partnership with Google Cloud. Read more.
      • What’s next for SAP on Google Cloud at SAPPHIRE NOW and beyond—As SAP’s SAPPHIRE conference begins this week, we believe businesses have a more significant opportunity than ever to build for their next decade of growth and beyond. Learn more on how we’re working together with our customers, SAP, and our partners to support this transformation. Read more.
      • Support for Node.js, Python, and Java repositories for Artifact Registry now in Preview–With today’s announcement, you can not only use Artifact Registry to secure and distribute container images, but also manage and secure your other software artifacts. Read more.
      • Google named a Leader in The Forrester Wave: Streaming Analytics, Q2 2021 report–Learn about the criteria where Google Dataflow was rated 5 out of 5 and why this matters for our customers here.
      • Applied ML Summit this Thursday, June 10–Watch our keynote to learn about predictions for machine learning over the next decade. Engage with distinguished researchers, leading practitioners, and Kaggle Grandmasters during our live Ask Me Anything session. Take part in our modeling workshops to learn how you can iterate faster, and deploy and manage your models with confidence–no matter your level of formal computer science training. Learn how to develop and apply your professional skills, grow your abilities at the pace of innovation, and take your career to the next level. Register now.

      Week of May 31-Jun 4, 2021

      • Security Command Center now supports CIS 1.1 benchmarks and granular access control–Security Command Center (SCC) now supports CIS benchmarks for Google Cloud Platform Foundation v1.1, enabling you to monitor and address compliance violations against industry best practices in your Google Cloud environment. Additionally, SCC now supports fine-grained access control for administrators that allows you to easily adhere to the principle of least privilege—restricting access based on roles and responsibilities to reduce risk and enabling broader team engagement to address security. Read more.
      • Zero-trust managed security for services with Traffic Director–We created Traffic Director to bring to you a fully managed service mesh product that includes load balancing, traffic management and service discovery. And now, we’re happy to announce the availability of a fully-managed zero-trust security solution using Traffic Director with Google Kubernetes Engine (GKE) and Certificate Authority (CA) Service. Read more.
      • How one business modernized their data warehouse for customer success–PedidosYa migrated from their old data warehouse to Google Cloud's BigQuery. Now with BigQuery, the Latin American online food ordering company has reduced the total cost per query by 5x. Learn more.
      • Announcing new Cloud TPU VMs–New Cloud TPU VMs make it easier to use our industry-leading TPU hardware by providing direct access to TPU host machines, offering a new and improved user experience to develop and deploy TensorFlow, PyTorch, and JAX on Cloud TPUs. Read more.
      • Introducing logical replication and decoding for Cloud SQL for PostgreSQL–We’re announcing the public preview of logical replication and decoding for Cloud SQL for PostgreSQL. By releasing these capabilities and enabling change data capture (CDC) from Cloud SQL for PostgreSQL, we strengthen our commitment to building an open database platform that meets critical application requirements and integrates seamlessly with the PostgreSQL ecosystem. Read more, or see the sketch after this list.
      • How 6 businesses are transforming with SAP on Google Cloud–Thousands of organizations globally rely on SAP for their most mission critical workloads. And for many Google Cloud customers, part of a broader digital transformation journey has included accelerating the migration of these essential SAP workloads to Google Cloud for greater agility, elasticity, and uptime. Read six of their stories.
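
      For the logical replication item above, a minimal sketch of creating a logical replication slot on a Cloud SQL for PostgreSQL instance once the feature is enabled; the connection details are hypothetical, and the user needs replication privileges.

        import psycopg2

        conn = psycopg2.connect(
            host="10.0.0.5", dbname="postgres", user="replicator", password="..."
        )
        conn.autocommit = True
        with conn.cursor() as cur:
            # 'pgoutput' is PostgreSQL's built-in logical decoding plugin;
            # downstream CDC consumers read changes from this slot.
            cur.execute(
                "SELECT pg_create_logical_replication_slot('cdc_slot', 'pgoutput')"
            )
            print(cur.fetchone())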

      Week of May 24-May 28, 2021

      • Google Cloud for financial services: driving your transformation cloud journey–As we welcome the industry to our Financial Services Summit, we’re sharing more on how Google Cloud accelerates a financial organization’s digital transformation through app and infrastructure modernization, data democratization, people connections, and trusted transactions. Read more or watch the summit on demand.
      • Introducing Datashare solution for financial services–We announced the general availability of Datashare for financial services, a new Google Cloud solution that brings together the entire capital markets ecosystem—data publishers and data consumers—to exchange market data securely and easily. Read more.
      • Announcing Datastream in Preview–Datastream, a serverless change data capture (CDC) and replication service, allows enterprises to synchronize data across heterogeneous databases, storage systems, and applications reliably and with minimal latency to support real-time analytics, database replication, and event-driven architectures. Read more.
      • Introducing Dataplex: an intelligent data fabric for analytics at scale–Dataplex provides a way to centrally manage, monitor, and govern your data across data lakes, data warehouses, and data marts, and make this data securely accessible to a variety of analytics and data science tools. Read more.
      • Announcing Dataflow Prime–Available in Preview in Q3 2021, Dataflow Prime is a new platform based on a serverless, no-ops, auto-tuning architecture built to bring unparalleled resource utilization and radical operational simplicity to big data processing. Dataflow Prime builds on Dataflow and brings new user benefits with innovations in resource utilization and distributed diagnostics. The new capabilities in Dataflow significantly reduce the time spent on infrastructure sizing and tuning tasks, as well as time spent diagnosing data freshness problems. Read more.
      • Secure and scalable sharing for data and analytics with Analytics Hub–With Analytics Hub, available in Preview in Q3, organizations get a rich data ecosystem by publishing and subscribing to analytics-ready datasets; control and monitoring over how their data is being used; a self-service way to access valuable and trusted data assets; and an easy way to monetize their data assets without the overhead of building and managing the infrastructure. Read more.
      • Cloud Spanner trims entry cost by 90%–Coming soon to Preview, granular instance sizing in Spanner lets organizations run workloads at as low as 1/10th the cost of regular instances, equating to approximately $65/month. Read more.
      • Cloud Bigtable lifts SLA and adds new security features for regulated industries–Bigtable instances with a multi-cluster routing policy across 3 or more regions are now covered by a 99.999% monthly uptime percentage under the new SLA. In addition, new Data Access audit logs can help determine whether sensitive customer information has been accessed in the event of a security incident, and if so, when, and by whom. Read more.
      • Build a no-code journaling app–In honor of Mental Health Awareness Month, Google Cloud's no-code application development platform, AppSheet, demonstrates how you can build a journaling app complete with titles, time stamps, mood entries, and more. Learn how with this blog and video here.
      • New features in Security Command Center—On May 24th, Security Command Center Premium launched the general availability of granular access controls at project- and folder-level and Center for Internet Security (CIS) 1.1 benchmarks for Google Cloud Platform Foundation. These new capabilities enable organizations to improve their security posture and efficiently manage risk for their Google Cloud environment. Learn more.
      • Simplified API operations with AI–Google Cloud's API management platform Apigee applies Google's industry-leading ML and AI to your API metadata. Understand how it works with anomaly detection here.
      • This week: Data Cloud and Financial Services Summits–Our Google Cloud Summit series begins this week with the Data Cloud Summit on Wednesday May 26 (Global). At this half-day event, you’ll learn how leading companies like PayPal, Workday, Equifax, and many others are driving competitive differentiation using Google Cloud technologies to build their data clouds and transform data into value that drives innovation. The following day, Thursday May 27 (Global & EMEA) at the Financial Services Summit, discover how Google Cloud is helping financial institutions such as PayPal, Global Payments, HSBC, Credit Suisse, AXA Switzerland and more unlock new possibilities and accelerate business through innovation. Read more and explore the entire summit series.
      • Announcing the Google for Games Developer Summit 2021 on July 12th-13th–With a surge of new gamers and an increase in time spent playing games in the last year, it’s more important than ever for game developers to delight and engage players. To help developers with this opportunity, the games teams at Google are back to announce the return of the Google for Games Developer Summit 2021 on July 12th-13th. Hear from experts across Google about new game solutions they’re building to make it easier for you to continue creating great games, connecting with players and scaling your business. Registration is free and open to all game developers. Register for the free online event at g.co/gamedevsummit to get more details in the coming weeks. We can’t wait to share our latest innovations with the developer community. Learn more.

      Week of May 17-May 21, 2021

      • Best practices to protect your organization against ransomware threats–For more than 20 years Google has been operating securely in the cloud, using our modern technology stack to provide a more defensible environment that we can protect at scale. While the threat of ransomware isn’t new, our responsibility to help protect you from existing or emerging threats never changes. In our recent blog post, we shared guidance on how organizations can increase their resilience to ransomware and how some of our Cloud products and services can help. Read more.

      • Forrester names Google Cloud a Leader in Unstructured Data Security Platforms–Forrester Research has named Google Cloud a Leader in The Forrester Wave: Unstructured Data Security Platforms, Q2 2021 report, and rated Google Cloud highest in the current offering category among the providers evaluated. Read more or download the report.
      • Introducing Vertex AI: one platform, every ML tool you need–Vertex AI is a managed machine learning (ML) platform that allows companies to accelerate the deployment and maintenance of artificial intelligence (AI) models. Read more.
      • Transforming collaboration in Google Workspace–We’re launching smart canvas, a new product experience that delivers the next evolution of collaboration for Google Workspace. Between now and the end of the year, we’re rolling out innovations that make it easier for people to stay connected, focus their time and attention, and transform their ideas into impact. Read more.
      • Developing next-generation geothermal power–At I/O this week, we announced a first-of-its-kind, next-generation geothermal project with clean-energy startup Fervo that will soon begin adding carbon-free energy to the electric grid that serves our data centers and infrastructure throughout Nevada, including our Cloud region in Las Vegas. Read more.
      • Contributing to an environment of trust and transparency in Europe–Google Cloud was one of the first cloud providers to support and adopt the EU GDPR Cloud Code of Conduct (CoC). The CoC is a mechanism for cloud providers to demonstrate how they offer sufficient guarantees to implement appropriate technical and organizational measures as data processors under the GDPR. This week, the Belgian Data Protection Authority, based on a positive opinion by the European Data Protection Board (EDPB), approved the CoC, a product of years of constructive collaboration between the cloud computing community, the European Commission, and European data protection authorities. We are proud to say that Google Cloud Platform and Google Workspace already adhere to these provisions. Learn more.
      • Announcing Google Cloud datasets solutions–We're adding commercial, synthetic, and first-party data to our Google Cloud Public Datasets Program to help organizations increase the value of their analytics and AI initiatives, and we're making available an open source reference architecture for a more streamlined data onboarding process to the program. Read more.
      • Introducing custom samples in Cloud Code–With new custom samples in Cloud Code, developers can quickly access your enterprise’s best code samples via a versioned Git repository directly from their IDEs. Read more.
      • Retention settings for Cloud SQL–Cloud SQL now allows you to configure backup retention settings to protect against data loss. You can retain between 1 and 365 days’ worth of automated backups and between 1 and 7 days’ worth of transaction logs for point-in-time recovery. See the details here, or the sketch after this list.
      • Cloud developer’s guide to Google I/O 2021–Google I/O may look a little different this year, but don’t worry, you’ll still get the same first-hand look at the newest launches and projects coming from Google. Best of all, it’s free and available to all (virtually) on May 18-20. Read more.
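
      For the retention settings item above, a minimal sketch of configuring backup retention through the Cloud SQL Admin API: keep 30 automated backups and 7 days of transaction logs. The project and instance are hypothetical, and the field names are an assumption based on the Admin API at the time.

        from googleapiclient import discovery

        sqladmin = discovery.build("sqladmin", "v1beta4")
        body = {
            "settings": {
                "backupConfiguration": {
                    "enabled": True,
                    "backupRetentionSettings": {
                        "retentionUnit": "COUNT",
                        "retainedBackups": 30,
                    },
                    "transactionLogRetentionDays": 7,
                }
            }
        }
        sqladmin.instances().patch(
            project="my-project", instance="my-instance", body=body
        ).execute()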

      Week of May 10-May 14, 2021

      • APIs and Apigee power modern day due diligence–With APIs and Google Cloud's Apigee, business due diligence company DueDil revolutionized the way they harness and share their Big Information Graph (B.I.G.) with partners and customers. Get the full story.
      • Cloud CISO Perspectives: May 2021–It’s been a busy month here at Google Cloud since our inaugural CISO perspectives blog post in April. Here, VP and CISO of Google Cloud Phil Venables recaps our cloud security and industry highlights, offers a sneak peek of what’s ahead from Google at RSA, and more. Read more.
      • 4 new features to secure your Cloud Run services–We announced several new ways to secure Cloud Run environments to make developing and deploying containerized applications easier for developers. Read more.
      • Maximize your Cloud Run investments with new committed use discounts–We’re introducing self-service, spend-based committed use discounts for Cloud Run, which let you commit to spending a certain amount on Cloud Run for a year and benefit from a 17% discount on the amount you commit. Read more.
      • Google Cloud Armor Managed Protection Plus is now generally available–Cloud Armor, our Distributed Denial of Service (DDoS) protection and Web-Application Firewall (WAF) service on Google Cloud, leverages the same infrastructure, network, and technology that has protected Google’s internet-facing properties from some of the largest attacks ever reported. These same tools protect customers’ infrastructure from DDoS attacks, which are increasing in both magnitude and complexity every year. Deployed at the very edge of our network, Cloud Armor absorbs malicious network- and protocol-based volumetric attacks, while mitigating the OWASP Top 10 risks and maintaining the availability of protected services. Read more.
      • Announcing Document Translation for Translation API Advanced in preview–Translation is critical to many developers and localization providers, whether you’re releasing a document, a piece of software, training materials, or a website in multiple languages. With Document Translation, you can now directly translate documents in 100+ languages and formats such as Docx, PPTx, XLSx, and PDF while preserving document formatting. Read more, or see the sketch after this list.
      • Introducing BeyondCorp Enterprise protected profiles–Protected profiles enable users to securely access corporate resources from an unmanaged device with the same threat and data protections available in BeyondCorp Enterprise–all from the Chrome Browser. Read more.
      • How reCAPTCHA Enterprise protects unemployment and COVID-19 vaccination portals–With so many people visiting government websites to learn more about the COVID-19 vaccine, make vaccine appointments, or file for unemployment, these web pages have become prime targets for bot attacks and other abusive activities. But reCAPTCHA Enterprise has helped state governments protect COVID-19 vaccine registration portals and unemployment claims portals from abusive activities. Learn more.
      • Day one with Anthos? Here are 6 ideas for how to get started–Once you have your new application platform in place, there are some things you can do to immediately get value and gain momentum. Here are six things you can do to get you started. Read more.
      • The era of the transformation cloud is here–Google Cloud’s president Rob Enslin shares how the era of the transformation cloud has seen organizations move beyond data centers to change not only where their business is done but, more importantly, how it is done. Read more.
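
      For the Document Translation item above, here’s a minimal Python sketch using the Cloud Translation v3 client library. The project ID, file names, and language codes are placeholders, and exact field and response names may vary by client-library version, so treat this as illustrative rather than definitive:

        from google.cloud import translate_v3 as translate

        client = translate.TranslationServiceClient()
        with open("contract.pdf", "rb") as f:
            pdf_bytes = f.read()

        # Translate the PDF from English to French, preserving formatting.
        response = client.translate_document(
            request={
                "parent": "projects/my-project/locations/global",  # placeholder project
                "source_language_code": "en",
                "target_language_code": "fr",
                "document_input_config": {
                    "content": pdf_bytes,
                    "mime_type": "application/pdf",
                },
            }
        )

        # The translated document comes back as bytes in the same format.
        with open("contract-fr.pdf", "wb") as out:
            out.write(response.document_translation.byte_stream_outputs[0])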

      Week of May 3-May 7, 2021

      • Transforming hard-disk drive maintenance with predictive ML–In collaboration with Seagate, we developed a machine learning system that can forecast the probability of a recurring failing disk—a disk that fails or has experienced three or more problems in 30 days. Learn how we did it.
      • Agent Assist for Chat is now in public preview–Agent Assist provides your human agents with continuous support during their calls, and now chats, by identifying the customer’s intent and giving agents real-time recommendations, such as relevant articles, FAQs, and suggested responses, to resolve the conversation more effectively. Read more.
      • New Google Cloud, AWS, and Azure product map–Our updated product map helps you understand similar offerings from Google Cloud, AWS, and Azure, and you can easily filter the list by product name or other common keywords. Read more or view the map.
      • Join our Google Cloud Security Talks on May 12th–We’ll share expert insights into how we’re working to be your most trusted cloud. Find the list of topics we’ll cover here.
      • Databricks is now GA on Google Cloud–Deploy or migrate Databricks Lakehouse to Google Cloud to combine the benefits of an open data cloud platform with greater analytics flexibility, unified infrastructure management, and optimized performance. Read more.
      • HPC VM image is now GA–The CentOS-based HPC VM image makes it quick and easy to create HPC-ready VMs on Google Cloud that are pre-tuned for optimal performance. Check out our documentation and quickstart guide to start creating instances using the HPC VM image today.
      • Take the 2021 State of DevOps survey–Help us shape the future of DevOps and make your voice heard by completing the 2021 State of DevOps survey before June 11, 2021. Read more or take the survey.
      • OpenTelemetry Trace 1.0 is now available–OpenTelemetry has reached a key milestone: the OpenTelemetry Tracing Specification has reached version 1.0. API and SDK release candidates are available for Java, Erlang, Python, Go, Node.js, and .NET, with additional languages to follow over the next few weeks. Read more. A minimal Python example follows this list.
      • New blueprint helps secure confidential data in AI Platform Notebooks–We’re adding to our portfolio of blueprints with the publication of our Protecting confidential data in AI Platform Notebooks blueprint guide and deployable blueprint, which can help you apply data governance and security policies that protect your AI Platform Notebooks containing confidential data. Read more.
      • The Liquibase Cloud Spanner extension is now GA–Liquibase, an open-source library that works with a wide variety of databases, can be used for tracking, managing, and automating database schema changes. By providing the ability to integrate databases into your CI/CD process, Liquibase helps you more fully adopt DevOps practices. The new extension brings this open-source tooling to Cloud Spanner, letting developers manage and automate schema changes there as well. Read more.
      • Cloud computing 101: Frequently asked questions–There are a number of terms and concepts in cloud computing, and not everyone is familiar with all of them. To help, we’ve put together a list of common questions, and the meanings of a few of those acronyms. Read more.
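
      To make the OpenTelemetry Trace milestone above concrete, here’s a minimal Python tracing sketch against a recent opentelemetry-sdk release. Class names shifted slightly across the release candidates, so check the docs for the exact version you install:

        from opentelemetry import trace
        from opentelemetry.sdk.trace import TracerProvider
        from opentelemetry.sdk.trace.export import (
            BatchSpanProcessor,
            ConsoleSpanExporter,
        )

        # Wire up the SDK: a provider plus a processor that exports finished spans.
        provider = TracerProvider()
        provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
        trace.set_tracer_provider(provider)

        tracer = trace.get_tracer("example.tracer")

        # Spans nest automatically through context propagation.
        with tracer.start_as_current_span("checkout") as span:
            span.set_attribute("order.id", "12345")
            with tracer.start_as_current_span("charge-card"):
                pass  # work here is recorded as a child span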

      Week of Apr 26-Apr 30, 2021

      • Announcing the GKE Gateway controller, in Preview–GKE Gateway controller, Google Cloud’s implementation of the Gateway API, manages internal and external HTTP/S load balancing for a GKE cluster or a fleet of GKE clusters and provides multi-tenant sharing of load balancer infrastructure with centralized admin policy and control. Read more.
      • See Network Performance for Google Cloud in Performance Dashboard–The Google Cloud performance view, part of the Network Intelligence Center, provides packet loss and latency metrics for traffic on Google Cloud. It allows users to do informed planning of their deployment architecture, as well as determine in real time the answer to the most common troubleshooting question: "Is it Google or is it me?" The Google Cloud performance view is now open for all Google Cloud customers as a public preview. Check it out.
      • Optimizing data in Google Sheets allows users to create no-code apps–Format columns and tables in Google Sheets to best position your data for transformation into a fully customized, successful app, no coding necessary. Read our four best Google Sheets tips.
      • Automation bots with AppSheet Automation–AppSheet recently released AppSheet Automation, infusing Google AI capabilities into AppSheet’s trusted no-code app development platform. Learn step by step how to build your first automation bot on AppSheet here.
      • Google Cloud announces a new region in Israel–Our new region in Israel will make it easier for customers to serve their own users faster, more reliably and securely. Read more.
      • New multi-instance NVIDIA GPUs on GKE–We’re launching support for multi-instance GPUs in GKE (currently in Preview), which will help you drive better value from your GPU investments. Read more.
      • Partnering with NSF to advance networking innovation–We announced our partnership with the U.S. National Science Foundation (NSF), joining other industry partners and federal agencies, as part of a combined $40 million investment in academic research for Resilient and Intelligent Next-Generation (NextG) Systems, or RINGS. Read more.
      • Creating a policy contract with Configuration as Data–Configuration as Data is an emerging cloud infrastructure management paradigm that allows developers to declare the desired state of their applications and infrastructure, without specifying the precise actions or steps for how to achieve it. However, declaring a configuration is only half the battle: you also want policy that defines how a configuration is to be used. This post shows you how.
      • Google Cloud products deliver real-time data solutions–Seven-Eleven Japan built Seven Central, its new platform for digital transformation, on Google Cloud. Powered by BigQuery, Cloud Spanner, and Apigee API management, Seven Central presents easy to understand data, ultimately allowing for quickly informed decisions. Read their story here.

      Week of Apr 19-Apr 23, 2021

      • Extreme PD is now GA–On April 20th, Google Cloud launched Extreme PD, a high-performance Persistent Disk block storage volume with provisioned IOPS and up to 2.2 GB/s of throughput, to general availability. Learn more.

      • Research: How data analytics and intelligence tools play a key role post-COVID-19–A recent Google-commissioned study by IDG highlighted the role of data analytics and intelligent solutions when it comes to helping businesses separate from their competition. The survey of 2,000 IT leaders across the globe reinforced the notion that the ability to derive insights from data will go a long way toward determining which companies win in this new era. Learn more or download the study.

      • Introducing PHP on Cloud Functions–We’re bringing support for PHP, a popular general-purpose programming language, to Cloud Functions. With the Functions Framework for PHP, you can write idiomatic PHP functions to build business-critical applications and integration layers. And with Cloud Functions for PHP, now available in Preview, you can deploy functions in a fully managed PHP 7.4 environment, complete with access to resources in a private VPC network. Learn more.

      • Delivering our 2020 CCAG pooled audit–As our customers increased their use of cloud services to meet the demands of teleworking and aid in COVID-19 recovery, we’ve worked hard to meet our commitment to being the industry’s most trusted cloud, despite the global pandemic. We’re proud to announce that Google Cloud completed an annual pooled audit with the CCAG in a completely remote setting, and were the only cloud service provider to do so in 2020. Learn more.

      • Anthos 1.7 now available–We recently released Anthos 1.7, our run-anywhere Kubernetes platform that’s connected to Google Cloud, delivering an array of capabilities that make multicloud more accessible and sustainable. Learn more.

      • New Redis Enterprise for Anthos and GKE–We’re making Redis Enterprise for Anthos and Google Kubernetes Engine (GKE) available in the Google Cloud Marketplace in private preview. Learn more.

      • Updates to Google Meet–We introduced a refreshed user interface (UI), enhanced reliability features powered by the latest Google AI, and tools that make meetings more engaging—even fun—for everyone involved. Learn more.

      • DocAI solutions now generally available–The Document AI (DocAI) platform, Lending DocAI, and Procurement DocAI, built on decades of AI innovation at Google, bring powerful and useful solutions across lending, insurance, government and other industries. Learn more.

      • Four consecutive years of 100% renewable energy–In 2020, Google again matched 100 percent of its global electricity use with purchases of renewable energy. All told, we’ve signed agreements to buy power from more than 50 renewable energy projects, with a combined capacity of 5.5 gigawatts–about the same as a million solar rooftops. Learn more.

      • Announcing the Google Cloud region picker–The Google Cloud region picker lets you assess key inputs like price, latency to your end users, and carbon footprint to help you choose which Google Cloud region to run on. Learn more.

      • Google Cloud launches new security solution WAAP–Web App and API Protection (WAAP) combines Google Cloud Armor, Apigee, and reCAPTCHA Enterprise to deliver improved threat protection, consolidated visibility, and greater operational efficiencies across clouds and on-premises environments. Learn more about WAAP here.
      • New in no-code–As discussed in our recent article, no-code hackathons are trending among innovative organizations. Since then, we've outlined how you can host one yourself specifically designed for your unique business innovation outcomes. Learn how here.
      • Google Cloud Referral Program now available—Now you can share the power of Google Cloud and earn product credit for every new paying customer you refer. Once you join the program, you’ll get a unique referral link that you can share with friends, clients, or others. Whenever someone signs up with your link, they’ll get a $350 product credit—that’s $50 more than the standard trial credit. When they become a paying customer, we’ll reward you with a $100 product credit in your Google Cloud account. Available in the United States, Canada, Brazil, and Japan. Apply for the Google Cloud Referral Program.

      Week of Apr 12-Apr 16, 2021

      • Announcing the Data Cloud Summit, May 26, 2021–At this half-day event, you’ll learn how leading companies like PayPal, Workday, Equifax, Zebra Technologies, Commonwealth Care Alliance and many others are driving competitive differentiation using Google Cloud technologies to build their data clouds and transform data into value that drives innovation. Learn more and register at no cost.
      • Announcing the Financial Services Summit, May 27, 2021–In this two-hour event, you’ll learn how Google Cloud is helping financial institutions including PayPal, Global Payments, HSBC, Credit Suisse, and more unlock new possibilities and accelerate business through innovation and better customer experiences. Learn more and register for free: Global & EMEA.
      • How Google Cloud is enabling vaccine equity–In our latest update, we share more on how we’re working with US state governments to help produce equitable vaccination strategies at scale. Learn more.
      • The new Google Cloud region in Warsaw is open–The Google Cloud region in Warsaw is now ready for business, opening doors for organizations in Central and Eastern Europe. Learn more.
      • AppSheet Automation is now GA–Google Cloud’s AppSheet launches general availability of AppSheet Automation, a unified development experience for citizen and professional developers alike to build custom applications with automated processes, all without coding. Learn how companies and employees are reclaiming their time and talent with AppSheet Automation here.
      • Introducing SAP Integration with Cloud Data Fusion–Google Cloud native data integration platform Cloud Data Fusion now offers the capability to seamlessly get data out of SAP Business Suite, SAP ERP and S/4HANA. Learn more.

      Week of Apr 5-Apr 9, 2021

      • New Certificate Authority Service (CAS) whitepaper–“How to deploy a secure and reliable public key infrastructure with Google Cloud Certificate Authority Service” (written by Mark Cooper of PKI Solutions and Anoosh Saboori of Google Cloud) covers security and architectural recommendations for the use of the Google Cloud CAS by organizations, and describes critical concepts for securing and deploying a PKI based on CAS. Learn more or read the whitepaper.
      • Active Assist’s new feature, predictive autoscaling, helps improve response times for your applications–When you enable predictive autoscaling, Compute Engine forecasts future load based on your managed instance group’s (MIG) history and scales out in advance of predicted load, so that new instances are ready to serve when the load arrives. Without predictive autoscaling, an autoscaler can only scale a group reactively, based on observed changes in load in real time. With it enabled, the autoscaler uses both real-time and historical data to cover current and forecasted load. That makes predictive autoscaling ideal for apps with long initialization times and workloads that vary predictably with daily or weekly cycles. For more information, see How predictive autoscaling works or check if predictive autoscaling is suitable for your workload, and to learn more about other intelligent features, check out Active Assist. A configuration sketch follows this list.
      • Introducing Dataprep BigQuery pushdown–BigQuery pushdown gives you the flexibility to run jobs using either BigQuery or Dataflow. If you select BigQuery, Dataprep automatically determines whether data pipelines can be partially or fully translated into BigQuery SQL statements; any portions of the pipeline that cannot run in BigQuery are executed in Dataflow. Utilizing the power of BigQuery results in highly efficient data transformations, especially for manipulations such as filters, joins, unions, and aggregations. This leads to better performance, optimized costs, and increased security with IAM and OAuth support. Learn more.
      • Announcing the Google Cloud Retail & Consumer Goods Summit–The Google Cloud Retail & Consumer Goods Summit brings together technology and business insights, the key ingredients for any transformation. Whether you're responsible for IT, data analytics, supply chains, or marketing, please join! Building connections and sharing perspectives cross-functionally is important to reimagining yourself, your organization, or the world. Learn more or register for free.
      • New IDC whitepaper assesses multicloud as a risk mitigation strategy–To better understand the benefits and challenges associated with a multicloud approach, we supported IDC’s new whitepaper that investigates how multicloud can help regulated organizations mitigate the risks of using a single cloud vendor. The whitepaper looks at different approaches to multi-vendor and hybrid clouds taken by European organizations and how these strategies can help organizations address concentration risk and vendor-lock in, improve their compliance posture, and demonstrate an exit strategy. Learn more or download the paper.
      • Introducing request priorities for Cloud Spanner APIs–You can now specify request priorities for some Cloud Spanner APIs. By assigning a HIGH, MEDIUM, or LOW priority to a specific request, you can convey the relative importance of workloads and better align resource usage with performance objectives. Learn more. A client-library sketch follows this list.
      • How we’re working with governments on climate goals–Google Sustainability Officer Kate Brandt shares more on how we’re partnering with governments around the world to provide our technology and insights to drive progress in sustainability efforts. Learn more.
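
      For the predictive autoscaling item above, the sketch below shows roughly how the feature can be enabled through the Compute Engine API using the google-cloud-compute Python client. Project, zone, and group names are placeholders, and the field spelling follows the REST API’s autoscalingPolicy.cpuUtilization.predictiveMethod, so verify against the current client-library reference:

        from google.cloud import compute_v1

        project, zone, mig = "my-project", "us-central1-a", "web-mig"  # placeholders

        autoscaler = compute_v1.Autoscaler(
            name="web-autoscaler",
            target=f"projects/{project}/zones/{zone}/instanceGroupManagers/{mig}",
            autoscaling_policy=compute_v1.AutoscalingPolicy(
                min_num_replicas=3,
                max_num_replicas=20,
                cpu_utilization=compute_v1.AutoscalingPolicyCpuUtilization(
                    utilization_target=0.6,
                    # Scale out ahead of forecasted load instead of only reacting.
                    predictive_method="OPTIMIZE_AVAILABILITY",
                ),
            ),
        )

        compute_v1.AutoscalersClient().insert(
            project=project, zone=zone, autoscaler_resource=autoscaler
        )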
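
      And for the Cloud Spanner request priorities item, a minimal Python sketch. The instance and database names are hypothetical, and the RequestOptions type is assumed from a recent google-cloud-spanner client release:

        from google.cloud import spanner
        from google.cloud.spanner_v1 import RequestOptions

        client = spanner.Client()
        database = client.instance("my-instance").database("my-db")  # placeholders

        # Run a background analytics query at low priority so it yields
        # to latency-sensitive traffic on the same instance.
        with database.snapshot() as snapshot:
            rows = snapshot.execute_sql(
                "SELECT COUNT(*) FROM Orders",
                request_options=RequestOptions(
                    priority=RequestOptions.Priority.PRIORITY_LOW
                ),
            )
            for row in rows:
                print(row)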

      Week of Mar 29-Apr 2, 2021

      • Why Google Cloud is the ideal platform for Block.one and other DLT companies–Late last year, Google Cloud joined the EOS community, a leading open-source platform for blockchain innovation and performance, and is taking steps to support the EOS Public Blockchain by becoming a block producer (BP). At the time, we outlined how our planned participation underscores the importance of blockchain to the future of business, government, and society. We're sharing more on why Google Cloud is uniquely positioned to be an excellent partner for Block.one and other distributed ledger technology (DLT) companies. Learn more.
      • New whitepaper: Scaling certificate management with Certificate Authority Service–As Google Cloud’s Certificate Authority Service (CAS) approaches general availability, we want to help customers understand the service better. Customers have asked us how CAS fits into our larger security story and how CAS works for various use cases. Our new white paper answers these questions and more. Learn more and download the paper.
      • Build a consistent approach for API consumers–Learn the differences between REST and GraphQL, as well as how to apply REST-based practices to GraphQL. No matter the approach, discover how to manage and treat both options as API products here.

      • Apigee X makes it simple to apply Cloud CDN to APIs–With Apigee X and Cloud CDN, organizations can expand their API programs' global reach. Learn how to deploy APIs across 24 regions and 73 zones here.

      • Enabling data migration with Transfer Appliances in APAC—We’re announcing the general availability of Transfer Appliances TA40/TA300 in Singapore. Customers are looking for fast, secure, and easy-to-use options to migrate their workloads to Google Cloud, and we are addressing their needs with Transfer Appliances globally in the US, EU, and APAC. Learn more about Transfer Appliances TA40 and TA300.

      • Windows Authentication is now supported on Cloud SQL for SQL Server in public preview—We’ve launched seamless integration with Google Cloud’s Managed Service for Microsoft Active Directory (AD). This capability is a critical requirement to simplify identity management and streamline the migration of existing SQL Server workloads that rely on AD for access control. Learn more or get started.

      • Using Cloud AI to whip up new treats with Mars Maltesers—Maltesers, a popular British candy made by Mars, teamed up with our own AI baker and ML engineer extraordinaire, Sara Robinson, to create a brand new dessert recipe with Google Cloud AI. Find out what happened (recipe included).

      • Simplifying data lake management with Dataproc Metastore, now GA—Dataproc Metastore, a fully managed, serverless technical metadata repository based on the Apache Hive metastore, is now generally available. Enterprises building and migrating open source data lakes to Google Cloud now have a central and persistent metastore for their open source data analytics frameworks. Learn more.

      • Introducing the Echo subsea cable—We announced our investment in Echo, the first-ever cable to directly connect the U.S. to Singapore with direct fiber pairs over an express route. Echo will run from Eureka, California to Singapore, with a stop-over in Guam, and plans to also land in Indonesia. Additional landings are possible in the future. Learn more.

      Week of Mar 22-Mar 26, 2021

      • 10 new videos bring Google Cloud to life—The Google Cloud Tech YouTube channel’s latest video series explains cloud tools for technical practitioners in about 5 minutes each. Learn more.
      • BigQuery named a Leader in the 2021 Forrester Wave: Cloud Data Warehouse, Q1 2021 report—Forrester gave BigQuery a score of 5 out of 5 across 19 different criteria. Learn more in our blog post, or download the report.
      • Charting the future of custom compute at Google—To meet users’ performance needs at low power, we’re doubling down on custom chips that use System on a Chip (SoC) designs. Learn more.
      • Introducing Network Connectivity Center—We announced Network Connectivity Center, which provides a single management experience to easily create, connect, and manage heterogeneous on-prem and cloud networks leveraging Google’s global infrastructure. Network Connectivity Center serves as a vantage point to seamlessly connect VPNs, partner and dedicated interconnects, as well as third-party routers and Software-Defined WANs, helping you optimize connectivity, reduce operational burden and lower costs—wherever your applications or users may be. Learn more.
      • Making it easier to get Compute Engine resources for batch processing—We announced a new method of obtaining Compute Engine instances for batch processing that accounts for the availability of resources across the zones of a region. Now available in preview for regional managed instance groups, you can opt in simply by specifying the ANY value in the API. Learn more. A sketch follows this list.
      • Next-gen virtual automotive showrooms are here, thanks to Google Cloud, Unreal Engine, and NVIDIA—We teamed up with Unreal Engine, the open and advanced real-time 3D creation game engine, and NVIDIA, inventor of the GPU, to launch new virtual showroom experiences for automakers. Taking advantage of the NVIDIA RTX platform on Google Cloud, these showrooms provide interactive 3D experiences, photorealistic materials and environments, and up to 4K cloud streaming on mobile and connected devices. Today, in collaboration with MHP, the Porsche IT consulting firm, and MONKEYWAY, a real-time 3D streaming solution provider, you can see our first virtual showroom, the Pagani Immersive Experience Platform. Learn more.
      • Troubleshoot network connectivity with Dynamic Verification (public preview)—You can now check packet loss rate and one-way network latency between two VMs on GCP. This capability is an addition to existing Network Intelligence Center Connectivity Tests which verify reachability by analyzing network configuration in your VPCs. See more in our documentation.
      • Helping U.S. states get the COVID-19 vaccine to more people—In February, we announced our Intelligent Vaccine Impact solution (IVIs) to help communities rise to the challenge of getting vaccines to more people quickly and effectively. Many states have deployed IVIs, and have found it able to meet demand and easily integrate with their existing technology infrastructures. Google Cloud is proud to partner with a number of states across the U.S., including Arizona, the Commonwealth of Massachusetts, North Carolina, Oregon, and the Commonwealth of Virginia to support vaccination efforts at scale. Learn more.
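
      For the batch-processing item above, here’s a sketch of creating a regional MIG with the ANY distribution shape via the google-cloud-compute Python client. Resource names are placeholders, and the field mirrors the REST API’s distributionPolicy.targetShape:

        from google.cloud import compute_v1

        project, region = "my-project", "us-central1"  # placeholders

        mig = compute_v1.InstanceGroupManager(
            name="batch-workers",
            target_size=100,
            instance_template=f"projects/{project}/global/instanceTemplates/batch-template",
            distribution_policy=compute_v1.DistributionPolicy(
                # ANY lets Compute Engine place instances in whichever zones
                # currently have capacity, rather than spreading them evenly.
                target_shape="ANY",
            ),
        )

        compute_v1.RegionInstanceGroupManagersClient().insert(
            project=project,
            region=region,
            instance_group_manager_resource=mig,
        )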

      Week of Mar 15-Mar 19, 2021

      • A2 VMs now GA: The largest GPU cloud instances with NVIDIA A100 GPUs—We’re announcing the general availability of A2 VMs based on the NVIDIA Ampere A100 Tensor Core GPUs in Compute Engine. This means customers around the world can now run their NVIDIA CUDA-enabled machine learning (ML) and high performance computing (HPC) scale-out and scale-up workloads more efficiently and at a lower cost. Learn more.
      • Earn the new Google Kubernetes Engine skill badge for free—We’ve added a new skill badge this month, Optimize Costs for Google Kubernetes Engine (GKE), which you can earn for free when you sign up for the Kubernetes track of the skills challenge. The skills challenge provides 30 days free access to Google Cloud labs and gives you the opportunity to earn skill badges to showcase different cloud competencies to employers. Learn more.
      • Now available: carbon free energy percentages for our Google Cloud regions—Google first achieved carbon neutrality in 2007, and since 2017 we’ve purchased enough solar and wind energy to match 100% of our global electricity consumption. Now we’re building on that progress to target a new sustainability goal: running our business on carbon-free energy 24/7, everywhere, by 2030. Beginning this week, we’re sharing data about how we are performing against that objective so our customers can select Google Cloud regions based on the carbon-free energy supplying them. Learn more.
      • Increasing bandwidth to C2 and N2 VMs—We announced the public preview of 100, 75, and 50 Gbps high-bandwidth network configurations for General Purpose N2 and Compute Optimized C2 Compute Engine VM families as part of continuous efforts to optimize our Andromeda host networking stack. This means we can now offer higher-bandwidth options on existing VM families when using the Google Virtual NIC (gVNIC). These VMs were previously limited to 32 Gbps. Learn more.
      • New research on how COVID-19 changed the nature of IT—To learn more about the impact of COVID-19 and the resulting implications to IT, Google commissioned a study by IDG to better understand how organizations are shifting their priorities in the wake of the pandemic. Learn more and download the report.

      • New in API security—Google Cloud Apigee API management platform's latest release, Apigee X, works with Cloud Armor to protect your APIs with advanced security technology including DDoS protection, geo-fencing, OAuth, and API keys. Learn more about our integrated security enhancements here.

      • Troubleshoot errors more quickly with Cloud Logging—The Logs Explorer now automatically breaks down your log results by severity, making it easy to spot spikes in errors at specific times. Learn more about our new histogram functionality here.

      Week of Mar 8-Mar 12, 2021

      • Introducing #AskGoogleCloud on Twitter and YouTube—Our first segment on March 12th features Developer Advocates Stephanie Wong, Martin Omander and James Ward to answer questions on the best workloads for serverless, the differences between “serverless” and “cloud native,” how to accurately estimate costs for using Cloud Run, and much more. Learn more.
      • Learn about the value of no-code hackathons—Google Cloud’s no-code application development platform, AppSheet, makes hackathons accessible to “non-technical” employees, with no coding necessary to compete. Learn about Globe Telecom’s no-code hackathon as well as their winning AppSheet app here.
      • Introducing Cloud Code Secret Manager Integration—Secret Manager provides a central place and single source of truth to manage, access, and audit secrets across Google Cloud. Integrating Cloud Code with Secret Manager brings the powerful capabilities of both these tools together, so you can create and manage your secrets right from within your preferred IDE, whether that’s VS Code, IntelliJ, or Cloud Shell Editor. Learn more. A minimal API sketch follows this list.
      • Flexible instance configurations in Cloud SQL—Cloud SQL for MySQL now supports flexible instance configurations which offer you the extra freedom to configure your instance with the specific number of vCPUs and GB of RAM that fits your workload. To set up a new instance with a flexible instance configuration, see our documentation here.
      • The Cloud Healthcare Consent Management API is now generally available—Now GA, the Healthcare Consent Management API gives customers the ability to scale consent management to meet growing needs, particularly as health data is put to work in new care and research scenarios. Learn more.
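
      The Cloud Code integration above sits on top of the same Secret Manager API you can call directly. For reference, a minimal Python access sketch, with placeholder project and secret names:

        from google.cloud import secretmanager

        client = secretmanager.SecretManagerServiceClient()

        # Placeholder resource name: latest version of a secret named db-password.
        name = "projects/my-project/secrets/db-password/versions/latest"
        response = client.access_secret_version(name=name)

        password = response.payload.data.decode("utf-8")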

      Week of Mar 1-Mar 5, 2021

      • Cloud Run is now available in all Google Cloud regions. Learn more.
      • Introducing Apache Spark Structured Streaming connector for Pub/Sub Lite—We’re announcing the release of an open source connector to read streams of messages from Pub/Sub Lite into Apache Spark. The connector works in all Apache Spark 2.4.X distributions, including Dataproc, Databricks, and manual Spark installations. Learn more. A PySpark sketch appears below.
      • Google Cloud Next ‘21 is October 12-14, 2021—Join us and learn how the most successful companies have transformed their businesses with Google Cloud. Sign up at g.co/cloudnext for updates. Learn more.
      • Hierarchical firewall policies now GA—Hierarchical firewalls provide a means to enforce firewall rules at the organization and folder levels in the GCP Resource Hierarchy. This allows security administrators at different levels in the hierarchy to define and deploy consistent firewall rules across a number of projects so they're applied to all VMs in currently existing and yet-to-be-created projects. Learn more.
      • Announcing the Google Cloud Born-Digital Summit—Over this half-day event, we’ll highlight proven best-practice approaches to data, architecture, diversity & inclusion, and growth with Google Cloud solutions. Learn more and register for free.
      • Google Cloud products in 4 words or less (2021 edition)—Our popular “4 words or less Google Cloud developer’s cheat sheet” is back and updated for 2021. Learn more.
      • Gartner names Google a leader in its 2021 Magic Quadrant for Cloud AI Developer Services report—We believe this recognition is based on Gartner’s evaluation of Google Cloud’s language, vision, conversational, and structured data services and solutions for developers. Learn more.
      • Announcing the Risk Protection Program—The Risk Protection Program offers customers peace of mind through the technology to secure their data, the tools to monitor the security of that data, and an industry-first cyber policy offered by leading insurers. Learn more.
      • Building the future of work—We’re introducing new innovations in Google Workspace to help people collaborate and find more time and focus, wherever and however they work. Learn more.
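
      For the Pub/Sub Lite connector announced above, a minimal PySpark read sketch. The subscription path is a placeholder (note it uses a project number), and the connector jar must be on the Spark classpath:

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("pubsublite-read").getOrCreate()

        # Placeholder subscription path.
        subscription = (
            "projects/123456789/locations/us-central1-a/subscriptions/my-lite-sub"
        )

        df = (
            spark.readStream.format("pubsublite")
            .option("pubsublite.subscription", subscription)
            .load()
        )

        # Print incoming messages to the console sink as they arrive.
        query = df.writeStream.format("console").start()
        query.awaitTermination()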

      • Assured Controls and expanded Data Regions—We’ve added new information governance features in Google Workspace to help customers control their data based on their business goals. Learn more.

      Week of Feb 22-Feb 26, 2021

      • 21 Google Cloud tools explained in 2 minutes—Need a quick overview of Google Cloud core technologies? Quickly learn these 21 Google Cloud products—each explained in under two minutes. Learn more.

      • BigQuery materialized views now GA—Materialized views (MVs) are precomputed views that periodically cache the results of a query to give customers increased performance and efficiency. Learn more.
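
      As a quick illustration, a materialized view is created with a single DDL statement. Here’s a hypothetical example issued through the BigQuery Python client (dataset, table, and column names are made up):

        from google.cloud import bigquery

        client = bigquery.Client()

        # Precompute a daily revenue aggregate; BigQuery keeps the view fresh
        # and can use it automatically to answer matching queries.
        client.query(
            """
            CREATE MATERIALIZED VIEW mydataset.daily_revenue AS
            SELECT DATE(order_ts) AS day, SUM(amount) AS revenue
            FROM mydataset.orders
            GROUP BY day
            """
        ).result()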

      • New in BigQuery BI Engine—We’re extending BigQuery BI Engine to work with any BI or custom dashboarding applications that require sub-second query response times. In this preview, BI Engine will work seamlessly with Looker and other popular BI tools such as Tableau and Power BI without requiring any change to the BI tools. Learn more.

      • Dataproc now supports Shielded VMs—All Dataproc clusters created using Debian 10 or Ubuntu 18.04 operating systems now use Shielded VMs by default, and customers can provide their own configurations for secure boot, vTPM, and Integrity Monitoring. This is one of the many ways customers who have migrated their Hadoop and Spark clusters to GCP see continued improvements to their security posture at no additional cost.

      • New Cloud Security Podcast by Google—Our new podcast brings you stories and insights on security in the cloud, delivering security from the cloud, and, of course, on what we’re doing at Google Cloud to help keep customer data safe and workloads secure. Learn more.

      • New in Conversational AI and Apigee technology—Australian retailer Woolworths provides seamless customer experiences with their virtual agent, Olive. Apigee API Management and Dialogflow technology allows customers to talk to Olive through voice and chat. Learn more.

      • Introducing GKE Autopilot—GKE already offers an industry-leading level of automation that makes setting up and operating a Kubernetes cluster easier and more cost effective than do-it-yourself and other managed offerings. Autopilot represents a significant leap forward. In addition to the fully managed control plane that GKE has always provided, using the Autopilot mode of operation automatically applies industry best practices and can eliminate all node management operations, maximizing your cluster efficiency and helping to provide a stronger security posture. Learn more.

      • Partnering with Intel to accelerate cloud-native 5G—As we continue to grow cloud-native services for the telecommunications industry, we’re excited to announce a collaboration with Intel to develop reference architectures and integrated solutions for communications service providers to accelerate their deployment of 5G and edge network solutions. Learn more.

      • Veeam Backup for Google Cloud now available—Veeam Backup for Google Cloud automates Google-native snapshots to securely protect VMs across projects and regions with ultra-low RPOs and RTOs, and store backups in Google Object Storage to enhance data protection while ensuring lower costs for long-term retention.

      • Migrate for Anthos 1.6 GA—With Migrate for Anthos, customers and partners can automatically migrate and modernize traditional application workloads running in VMs into containers running on Anthos or GKE. Included in this new release: 

        • In-place modernization for Anthos on AWS (Public Preview) to help customers accelerate onboarding to Anthos on AWS while leveraging their existing investment in AWS data sources, projects, VPCs, and IAM controls.

        • Additional Docker registries and artifacts repositories support (GA) including AWS ECR, basic-auth docker registries, and AWS S3 storage to provide further flexibility for customers using Anthos Anywhere (on-prem, AWS, etc.).

        • HTTPS Proxy support (GA) to enable M4A functionality (access to external image repos and other services) where a proxy is used to control external access.

      Week of Feb 15-Feb 19, 2021

      • Introducing Cloud Domains in preview—Cloud Domains simplifies domain registration and management within Google Cloud, improves the custom domain experience for developers, increases security, and supports stronger integrations around DNS and SSL. Learn more.

      • Announcing Databricks on Google Cloud—Our partnership with Databricks enables customers to accelerate Databricks implementations by simplifying their data access, by jointly giving them powerful ways to analyze their data, and by leveraging our combined AI and ML capabilities to impact business outcomes. Learn more.

      • Service Directory is GA—As the number and diversity of services grows, it becomes increasingly challenging to maintain an inventory of all of the services across an organization. Last year, we launched Service Directory to help simplify the problem of service management. Today, it’s generally available. Learn more.

      Week of Feb 8-Feb 12, 2021

      • Introducing Bare Metal Solution for SAP workloads—We’ve expanded our Bare Metal Solution—dedicated, single-tenant systems designed specifically to run workloads that are too large or otherwise unsuitable for standard, virtualized environments—to include SAP-certified hardware options, giving SAP customers great options for modernizing their biggest and most challenging workloads. Learn more.

      • 9TB SSDs bring ultimate IOPS/$ to Compute Engine VMs—You can now attach 6TB and 9TB Local SSD to second-generation general-purpose N2 Compute Engine VMs, for great IOPS per dollar. Learn more.

      • Supporting the Python ecosystem—As part of our longstanding support for the Python ecosystem, we are happy to increase our support for the Python Software Foundation, the non-profit behind the Python programming language, ecosystem and community. Learn more.

      • Migrate to regional backend services for Network Load Balancing—We now support backend services with Network Load Balancing—a significant enhancement over the prior approach, target pools, providing a common unified data model for all our load-balancing family members and accelerating the delivery of exciting features on Network Load Balancing. Learn more.

      Week of Feb 1-Feb 4, 2021

      • Apigee launches Apigee X—Apigee celebrates its 10-year anniversary with Apigee X, a new release of the Apigee API management platform. Apigee X harnesses the best of Google technologies to accelerate and globalize your API-powered digital initiatives. Learn more about Apigee X and digital excellence here.
      • Celebrating the success of Black founders with Google Cloud during Black History Month—February is Black History Month, a time for us to come together to celebrate and remember the important people and history of African heritage. Over the next four weeks, we will highlight four Black-led startups and how they use Google Cloud to grow their businesses. Our first feature highlights TQIntelligence and its founder, Yared.

      Week of Jan 25-Jan 29, 2021

      • BeyondCorp Enterprise now generally available—BeyondCorp Enterprise is a zero trust solution, built on Google’s global network, which provides customers with simple and secure access to applications and cloud resources and offers integrated threat and data protection. To learn more, read the blog post, visit our product homepage, and register for our upcoming webinar.

      Week of Jan 18-Jan 22, 2021

      • Cloud Operations Sandbox now available—Cloud Operations Sandbox is an open-source tool that helps you learn SRE practices from Google and apply them on cloud services using Google Cloud’s operations suite (formerly Stackdriver), with everything you need to get started in one click. You can read our blog post, or get started by visiting cloud-ops-sandbox.dev, exploring the project repo, and following along in the user guide.

      • New data security strategy whitepaper—Our new whitepaper shares our best practices for how to deploy a modern and effective data security program in the cloud. Read the blog post or download the paper.   

      • WebSockets, HTTP/2 and gRPC bidirectional streams come to Cloud Run—With these capabilities, you can deploy new kinds of applications to Cloud Run that were not previously supported, while taking advantage of serverless infrastructure. These features are now available in public preview for all Cloud Run locations. Read the blog post or check out the WebSockets demo app or the sample h2c server app.
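
      For a flavor of what this unlocks, here’s a minimal WebSocket echo server sketch in Python using the third-party websockets package, listening on the PORT environment variable that Cloud Run sets. Handler signatures vary slightly across websockets versions, so treat this as illustrative:

        import asyncio
        import os

        import websockets  # third-party package


        async def echo(websocket):
            # Echo every message back over the same long-lived connection.
            async for message in websocket:
                await websocket.send(message)


        async def main():
            port = int(os.environ.get("PORT", "8080"))  # Cloud Run sets PORT
            async with websockets.serve(echo, "0.0.0.0", port):
                await asyncio.Future()  # serve forever


        if __name__ == "__main__":
            asyncio.run(main())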

      • New tutorial: Build a no-code workout app in 5 steps—Looking to crush your new year’s resolutions? Using AppSheet, Google Cloud’s no-code app development platform, you can build a custom fitness app that can do things like record your sets, reps and weights, log your workouts, and show you how you’re progressing. Learn how.


      Week of Jan 11-Jan 15, 2021

      • State of API Economy 2021 Report now available—Google Cloud details the changing role of APIs in 2020 amidst the COVID-19 pandemic, informed by a comprehensive study of Apigee API usage behavior across industry, geography, enterprise size, and more. Discover these 2020 trends along with a projection of what to expect from APIs in 2021. Read our blog post here or download and read the report here.
      • New in the state of no-code—Google Cloud's AppSheet looks back at the key no-code application development themes of 2020. AppSheet contends the rising number of citizen developer app creators will ultimately change the state of no-code in 2021. Read more here.


      Week of Jan 4-Jan 8, 2021

      • Last year's most popular API posts—In an arduous year, thoughtful API design and strategy is critical to empowering developers and companies to use technology for global good. Google Cloud looks back at the must-read API posts in 2020. Read it here.


      Week of Dec 21-Dec 25, 2020


      Week of Dec 14-Dec 18, 2020

      • Memorystore for Redis enables TLS encryption support (Preview)—With this release, you can now use Memorystore for applications requiring sensitive data to be encrypted between the client and the Memorystore instance. Read more here.
      • Monitoring Query Language (MQL) for Cloud Monitoring is now generally available—Monitoring Query Language gives developers and operators on IT and development teams powerful metric querying, analysis, charting, and alerting capabilities. This functionality is needed for Monitoring use cases that include troubleshooting outages, root cause analysis, custom SLI/SLO creation, reporting and analytics, complex alert logic, and more. Learn more.
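
      As a taste of MQL, the sketch below issues a query through the Cloud Monitoring Python client. The project name is a placeholder and the query itself is illustrative (mean CPU utilization per zone over the last hour); consult the MQL reference for exact syntax:

        from google.cloud import monitoring_v3

        client = monitoring_v3.QueryServiceClient()

        # Illustrative MQL: mean CPU utilization per zone over the last hour.
        query = """
        fetch gce_instance
        | metric 'compute.googleapis.com/instance/cpu/utilization'
        | group_by [resource.zone], mean(val())
        | within 1h
        """

        results = client.query_time_series(
            request={"name": "projects/my-project", "query": query}  # placeholder
        )
        for series in results:
            print(series)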


      Week of Dec 7-Dec 11, 2020

      • Memorystore for Redis now supports Redis AUTH—With this release you can now use the OSS Redis AUTH feature with Memorystore for Redis instances. Read more here. A minimal connection sketch follows this list.
      • New in serverless computing—Google Cloud API Gateway and its service-first approach to developing serverless APIs helps organizations accelerate innovation by eliminating scalability and security bottlenecks for their APIs. Discover more benefits here.
      • Environmental Dynamics, Inc. makes a big move to no-code—The environmental consulting company EDI built and deployed 35+ business apps, with no coding skills necessary, using Google Cloud’s AppSheet. This no-code effort not only empowered field workers, but also saved employees over 2,550 hours a year. Get the full story here.
      • Introducing Google Workspace for Government—Google Workspace for Government is an offering that brings the best of Google Cloud’s collaboration and communication tools to the government with pricing that meets the needs of the public sector. Whether it’s powering social care visits, employment support, or virtual courts, Google Workspace helps governments meet the unique challenges they face as they work to provide better services in an increasingly virtual world. Learn more.
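
      For the Redis AUTH item above, connecting from a client is just a matter of supplying the instance’s AUTH string as the password. A minimal redis-py sketch, with placeholder host and AUTH values:

        import redis

        # Placeholder host and AUTH string (retrievable via the Memorystore API).
        r = redis.Redis(host="10.0.0.3", port=6379, password="AUTH_STRING")
        r.set("greeting", "hello")
        print(r.get("greeting"))  # b'hello'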


      Week of Nov 30-Dec 4, 2020

      • Google enters agreement to acquire Actifio—Actifio, a leader in backup and disaster recovery (DR), offers customers the opportunity to protect virtual copies of data in their native format, manage these copies throughout their entire lifecycle, and use these copies for scenarios like development and test. This planned acquisition further demonstrates Google Cloud’s commitment to helping enterprises protect workloads on-premises and in the cloud. Learn more.
      • Traffic Director can now send traffic to services and gateways hosted outside of Google Cloud—Traffic Director support for Hybrid Connectivity Network Endpoint Groups (NEGs), now generally available, enables services in your VPC network to interoperate more seamlessly with services in other environments. It also enables you to build advanced solutions based on Google Cloud's portfolio of networking products, such as Cloud Armor protection for your private on-prem services. Learn more.
      • Google Cloud launches the Healthcare Interoperability Readiness Program—This program, powered by APIs and Google Cloud’s Apigee, helps patients, doctors, researchers, and healthcare technologists alike by making patient data and healthcare data more accessible and secure. Learn more here.
      • Container Threat Detection in Security Command Center—We announced the general availability of Container Threat Detection, a built-in service in Security Command Center. This release includes multiple detection capabilities to help you monitor and secure your container deployments in Google Cloud. Read more here.
      • Anthos on bare metal now GA—Anthos on bare metal opens up new possibilities for how you run your workloads, and where. You can run Anthos on your existing virtualized infrastructure, or eliminate the dependency on a hypervisor layer to modernize applications while reducing costs. Learn more.


      Week of Nov 23-27, 2020

      • Tuning control support in Cloud SQL for MySQL—All 80 flags that were previously in preview are now generally available (GA), empowering you with the controls you need to optimize your databases. See the full list here.
      • New in BigQuery ML—We announced the general availability of boosted trees using XGBoost, deep neural networks (DNNs) using TensorFlow, and model export for online prediction. Learn more. A sample training statement follows this list.
      • New AI/ML in retail report—We recently commissioned a survey of global retail executives to better understand which AI/ML use cases across the retail value chain drive the highest value and returns in retail, and what retailers need to keep in mind when going after these opportunities. Learn more or read the report.
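
      For the BigQuery ML item above, training a boosted tree classifier is a single SQL statement. A hypothetical example via the BigQuery Python client, with made-up table and column names:

        from google.cloud import bigquery

        client = bigquery.Client()

        # Train a boosted tree model to predict the `churned` label column.
        client.query(
            """
            CREATE OR REPLACE MODEL mydataset.churn_model
            OPTIONS (
              model_type = 'BOOSTED_TREE_CLASSIFIER',
              input_label_cols = ['churned']
            ) AS
            SELECT * FROM mydataset.customers
            """
        ).result()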


      Week of Nov 16-20, 2020

      • New whitepaper on how AI helps the patent industry—Our new paper outlines a methodology to train a BERT (bidirectional encoder representation from transformers) model on over 100 million patent publications from the U.S. and other countries using open-source tooling. Learn more or read the whitepaper.
      • Google Cloud support for .NET 5.0—Learn more about our support of .NET 5.0, as well as how to deploy it to Cloud Run.
      • .NET Core 3.1 now on Cloud Functions—With this integration you can write cloud functions using your favorite .NET Core 3.1 runtime with our Functions Framework for .NET for an idiomatic developer experience. Learn more.
      • Filestore Backups in preview—We announced the availability of the Filestore Backups preview in all regions, making it easier to bring your business continuity, disaster recovery, and backup strategy to your file systems in Google Cloud. Learn more.
      • Introducing Voucher, a service to help secure the container supply chain—Developed by the Software Supply Chain Security team at Shopify to work with Google Cloud tools, Voucher evaluates container images created by CI/CD pipelines and signs those images if they meet certain predefined security criteria. Binary Authorization then validates these signatures at deploy time, ensuring that only explicitly authorized code that meets your organizational policy and compliance requirements can be deployed to production. Learn more.
      • 10 most watched from Google Cloud Next ‘20: OnAir—Take a stroll through the 10 sessions that were most popular from Next OnAir, covering everything from data analytics to cloud migration to no-code development. Read the blog.
      • Artifact Registry is now GA—With support for container images, Maven, npm packages, and additional formats coming soon, Artifact Registry helps your organization benefit from scale, security, and standardization across your software supply chain. Read the blog.


      Week of Nov 9-13, 2020

      • Introducing the Anthos Developer Sandbox—The Anthos Developer Sandbox gives you an easy way to learn to develop on Anthos at no cost, available to anyone with a Google account. Read the blog.
      • Database Migration Service now available in preview—Database Migration Service (DMS) makes migrations to Cloud SQL simple and reliable. DMS supports migrations of self-hosted MySQL databases—either on-premises or in the cloud, as well as managed databases from other clouds—to Cloud SQL for MySQL. Support for PostgreSQL is currently available for limited customers in preview, with SQL Server coming soon. Learn more.
      • Troubleshoot deployments or production issues more quickly with new logs tailing—We’ve added support for a new API to tail logs with low latency. Using gcloud, it allows you the convenience of tail -f with the powerful query language and centralized logging solution of Cloud Logging. Learn more about this preview feature.
      • Regionalized log storage now available in 5 new regions in preview—You can now select where your logs are stored from one of five regions in addition to global—asia-east1, europe-west1, us-central1, us-east1, and us-west1. When you create a logs bucket, you can set the region in which you want to store your logs data. Get started with this guide.


      Week of Nov 2-6, 2020

      • Cloud SQL adds support for PostgreSQL 13—Shortly after its community GA, Cloud SQL has added support for PostgreSQL 13. You get access to the latest features of PostgreSQL while Cloud SQL handles the heavy operational lifting, so your team can focus on accelerating application delivery. Read more here.
      • Apigee creates value for businesses running on SAP—Google Cloud’s API Management platform Apigee is optimized for data insights and data monetization, helping businesses running on SAP innovate faster without fear of SAP-specific challenges to modernization. Read more here.
      • Document AI platform is live—The new Document AI (DocAI) platform, a unified console for document processing, is now available in preview. You can quickly access all parsers, tools and solutions (e.g. Lending DocAI, Procurement DocAI) with a unified API, enabling an end-to-end document solution from evaluation to deployment. Read the full story here or check it out in your Google Cloud console.
      • Accelerating data migration with Transfer Appliances TA40 and TA300—We’re announcing the general availability of new Transfer Appliances. Customers are looking for fast, secure, and easy-to-use options to migrate their workloads to Google Cloud, and we are addressing their needs with next-generation Transfer Appliances. Learn more about Transfer Appliances TA40 and TA300.


      Week of Oct 26-30, 2020

      • B.H., Inc. accelerates digital transformation—The Utah-based contracting and construction company BHI eliminated its IT backlog when non-technical employees were empowered to build equipment inspection, productivity, and other custom apps by choosing Google Workspace and the no-code app development platform AppSheet. Read the full story here.
      • Globe Telecom embraces no-code development—Google Cloud’s AppSheet empowers Globe Telecom employees to do more innovating with less code. The global communications company kickstarted their no-code journey by combining the power of AppSheet with a unique adoption strategy. As a result, AppSheet helped Globe Telecom employees build 59 business apps in just 8 weeks. Get the full story.
      • Cloud Logging now allows you to control access to logs via Log Views—Building on the control offered via Log Buckets (blog post), you can now configure who has access to logs based on the source project, resource type, or log name, all using standard IAM controls. Log Views, currently in Preview, can help you build a system using the principle of least privilege, limiting sensitive logs to only the users who need them. Learn more about Log Views.
      • Document AI is HIPAA compliant—Document AI now supports HIPAA compliance, so healthcare and life science customers such as health care providers, health plans, and life science organizations can unlock insights by quickly extracting structured data from medical documents while safeguarding individuals’ protected health information (PHI). Learn more about Google Cloud’s nearly 100 products that support HIPAA compliance.


      Week of Oct 19-23, 2020

      • Improved security and governance in Cloud SQL for PostgreSQL—Cloud SQL for PostgreSQL now integrates with Cloud IAM (preview) to provide simplified and consistent authentication and authorization. Cloud SQL has also enabled PostgreSQL Audit Extension (preview) for more granular audit logging. Read the blog.
      • Announcing the AI in Financial Crime Compliance webinar—Our executive digital forum will feature industry executives, academics, and former regulators who will discuss how AI is transforming financial crime compliance on November 17. Register now.
      • Transforming retail with AI/ML—New research provides insights on high-value AI/ML use cases for food, drug, mass merchant and specialty retail that can drive significant value and build resilience for your business. Learn what the top use cases are for your sub-segment and read real-world success stories. Download the ebook here and view this companion webinar which also features insights from Zulily.
      • New release of Migrate for Anthos—We’re introducing two important new capabilities in the 1.5 release of Migrate for Anthos, Google Cloud's solution to easily migrate and modernize applications currently running on VMs so that they instead run on containers in Google Kubernetes Engine or Anthos. The first is GA support for modernizing IIS apps running on Windows Server VMs. The second is a new utility that helps you identify which VMs in your existing environment are the best targets for modernization to containers. Start migrating or check out the assessment tool documentation (Linux | Windows).
      • New Compute Engine autoscaler controls—New scale-in controls in Compute Engine let you limit the VM deletion rate by preventing the autoscaler from reducing a MIG’s size by more VM instances than your workload can tolerate losing. Read the blog.
      • Lending DocAI in preview—Lending DocAI is a specialized solution in our Document AI portfolio for the mortgage industry that processes borrowers’ income and asset documents to speed up loan applications. Read the blog, or check out the product demo.


      Week of Oct 12-16, 2020

      • New maintenance controls for Cloud SQL—Cloud SQL now offers maintenance deny period controls, which allow you to prevent automatic maintenance from occurring during a 90-day time period. Read the blog.
      • Trends in volumetric DDoS attacks—This week we published a deep dive into DDoS threats, detailing the trends we’re seeing and giving you a closer look at how we prepare for multi-terabit attacks so your sites stay up and running. Read the blog.
      • New in BigQuery—We shared a number of updates this week, including new SQL capabilities, more granular control over your partitions with time unit partitioning, the general availability of Table ACLs, and BigQuery System Tables Reports, a solution that aims to help you monitor BigQuery flat-rate slot and reservation utilization by leveraging BigQuery’s underlying INFORMATION_SCHEMA views. Read the blog.
      • Cloud Code makes YAML easy for hundreds of popular Kubernetes CRDs—We announced authoring support for more than 400 popular Kubernetes CRDs out of the box, any existing CRDs in your Kubernetes cluster, and any CRDs you add from your local machine or a URL. Read the blog.
      • Google Cloud’s data privacy commitments for the AI era—We’ve outlined how our AI/ML Privacy Commitment reflects our belief that customers should have both the highest level of security and the highest level of control over data stored in the cloud. Read the blog.

      • New, lower pricing for Cloud CDN—We’ve reduced cache fill charges (for content fetched from your origin) across the board by up to 80%. Together with the flexible caching capabilities we recently introduced, this makes it even easier to use Cloud CDN to optimize the performance of your applications. Read the blog.

      • Expanding the BeyondCorp Alliance—Last year, we announced our BeyondCorp Alliance with partners that share our Zero Trust vision. Today, we’re announcing new partners to this alliance. Read the blog.

      • New data analytics training opportunities—Throughout October and November, we’re offering a number of no-cost ways to learn data analytics, with trainings for beginners to advanced users. Learn more.

      • New BigQuery blog series—BigQuery Explained provides overviews on storage, data ingestion, queries, joins, and more. Read the series.


      Week of Oct 5-9, 2020

      • Introducing the Google Cloud Healthcare Consent Management API—This API gives healthcare application developers and clinical researchers a simple way to manage individuals’ consent of their health data, particularly important given the new and emerging virtual care and research scenarios related to COVID-19. Read the blog.

      • Announcing Google Cloud buildpacks—Based on the CNCF buildpacks v3 specification, these buildpacks produce container images that follow best practices and are suitable for running on all of our container platforms: Cloud Run (fully managed), Anthos, and Google Kubernetes Engine (GKE). Read the blog.

      • Providing open access to the Genome Aggregation Database (gnomAD)—Our collaboration with Broad Institute of MIT and Harvard provides free access to one of the world's most comprehensive public genomic datasets. Read the blog.

      • Introducing HTTP/gRPC server streaming for Cloud Run—Server-side HTTP streaming for your serverless applications running on Cloud Run (fully managed) is now available. This means your Cloud Run services can serve larger responses or stream partial responses to clients during the span of a single request, enabling quicker server response times for your applications. Read the blog.

      • New security and privacy features in Google Workspace—Alongside the announcement of Google Workspace we also shared more information on new security features that help facilitate safe communication and give admins increased visibility and control for their organizations. Read the blog.

      • Introducing Google Workspace—Google Workspace includes all of the productivity apps you know and use at home, at work, or in the classroom—Gmail, Calendar, Drive, Docs, Sheets, Slides, Meet, Chat and more—now more thoughtfully connected. Read the blog.

      • New in Cloud Functions: languages, availability, portability, and more—We extended Cloud Functions—our scalable pay-as-you-go Functions-as-a-Service (FaaS) platform that runs your code with zero server management—so you can now use it to build end-to-end solutions for several key use cases. Read the blog.

      • Announcing the Google Cloud Public Sector Summit, Dec 8-9—Our upcoming two-day virtual event will offer thought-provoking panels, keynotes, customer stories and more on the future of digital service in the public sector. Register at no cost.

    • AlloyDB for PostgreSQL under the hood: Intelligent, database-aware storage Wed, 11 May 2022 19:15:00 -0000

      Today, at Google I/O, we announced AlloyDB for PostgreSQL, a fully managed, PostgreSQL-compatible database for demanding, enterprise-grade transactional and analytical workloads. Imagine PostgreSQL plus the best of the cloud: elastic storage and compute, intelligent caching, and AI/ML-powered management. Further, AlloyDB delivers unmatched price-performance: in our performance tests, it’s more than 4x faster on transactional workloads, and up to 100x faster on analytical queries, than standard PostgreSQL, all with simple, predictable pricing. Designed for mission-critical applications, AlloyDB offers extensive data protection and industry-leading 99.99% availability.

      Multiple innovations underpin the performance and availability gains of AlloyDB for PostgreSQL. In this first part of our “AlloyDB for PostgreSQL under the hood” series, we cover the intelligent, database-aware, horizontally scalable storage layer that’s optimized for PostgreSQL.

      Disaggregation of compute and storage

      AlloyDB for PostgreSQL was built on the fundamental principle of disaggregation of compute and storage, and is designed to leverage disaggregation at every layer of the stack. 

      AlloyDB begins by separating the database layer from storage, introducing a new intelligent storage service, optimized for PostgreSQL. This reduces I/O bottlenecks, and allows AlloyDB to offload many database operations to the storage layer through the use of a log-processing system. The storage service itself also disaggregates compute and storage, allowing block storage to scale separately from log processing.

      Figure 1: Evolution of disaggregation of compute and storage

      Disaggregation of compute and storage within databases has evolved over the years. In early approaches, though storage could be sized independently from the compute layer, the overall system was still fairly static and lacked elasticity. By building on cloud-scale storage solutions, next-generation database systems improved on storage elasticity, but still suffered from either oversized storage clusters or a lack of I/O capacity during workload spikes (hotspots).

      With AlloyDB, full disaggregation even within the storage layer lets the system work as an elastic, distributed cluster that can dynamically adapt to changing workloads, adds failure tolerance, increases availability, and enables cost-efficient read pools that scale read throughput horizontally. Multiple layers of caching throughout the stack, automatically tiered based on workload patterns, give developers increased performance while retaining the scale, economics, and availability of cloud-native storage. Combined, these aspects of the AlloyDB architecture mark the next step in the evolution of disaggregation in databases and contribute to AlloyDB’s exceptional performance and availability.

      The trouble with monolithic design

      Traditionally, PostgreSQL databases employ a monolithic design, co-locating storage and compute resources in a single machine. In the event that you need more storage capacity or compute performance, you scale the system up by moving to a more powerful server, or adding more disks. When scaling up further is no longer possible or cost-effective (more powerful servers get increasingly expensive), you can use replication to create multiple, read-only copies of the database. 

      That approach has its limitations: failover times are longer and less predictable, as they depend on database load and configuration. Moreover, read replicas maintain their own lagging, expensive copies of the database, making it more difficult to scale read capacity and manage replica lag. As a result, the elasticity of a monolithic database with tightly coupled storage and compute is limited. By disaggregating compute and storage, AlloyDB is able to overcome many of these limitations.

      To further increase the scalability of the database layer beyond the capacity of a single (virtual) machine, AlloyDB allows you to add multiple read-only replica instances that support the primary database instance in read-only query processing without requiring additional database copies: since the storage layer is distributed across zones and accessible from any server, you can quickly build read replica instances that are inexpensive (each replica instance doesn’t require its own storage) and up to date. Fundamentally, these design principles allow us to create a platform that moves functionality out of the primary database instance monolith and converts it into a cloud-native implementation that provides better performance, scalability, availability, and manageability.

      AlloyDB design overview

      The AlloyDB storage layer is a distributed system comprising three major parts:

      1. A low-latency, regional log storage service for very fast write-ahead log (WAL) writing

      2. A log processing service (LPS) that processes these WAL records and produces “materialized” database blocks 

      3. Failure-tolerant, sharded, regional block storage that ensures durability even in case of zonal storage failures.

      Figure 2 below shows a high-level conceptual overview of the log processing service and its integration with the PostgreSQL database layer and durable storage. The primary database instance (of which there is only one) persists WAL log entries, reflecting database modification operations (such as INSERT/DELETE/UPDATE) to the low-latency regional log store. From there, the log processing service (LPS) consumes these WAL records for further processing. Because the log processing service is fully aware of the semantics of the PostgreSQL WAL records and the PostgreSQL storage format, it can continuously replay the modification operations described by these WAL records and materialize up-to-date database blocks to a sharded, regional storage system. From there, these blocks can then either be served back to the primary database instance (in the case of a restart or simply when a block falls out of cache) or to any number of replica instances that might be in any of the zones within the region where the storage service operates.

      To keep the local caches of the replica instances up-to-date, AlloyDB also streams WAL records from the primary to the replica instances to notify them about recent changes. Without this information about changing blocks, cached blocks in the replica instances could become arbitrarily stale.

      Figure 2: Overview of PostgreSQL as integrated with the storage service.

      What are the key benefits of this approach? Let's consider some of the implications of this design in further detail:

      • Full compute/storage disaggregation even within the storage layer. LPS can scale out based on workload patterns, and transparently add more compute resources to process logs when needed to avoid hotspots. Since the log processors are purely compute-attached to a shared regional storage, they can flexibly scale out/in without needing to copy any data.

      • Storage-layer replication: By synchronously replicating all blocks across multiple zones, the storage layer automatically protects the system from zonal failures without any impact on or modifications to the database layer. 

      • Efficient IO paths / no full-page writes: For update operations, the compute layer only communicates the WAL records to the storage layer, which is continuously replaying them. In this design, there is no need to checkpoint the database layer, or any reason to send complete database blocks to the storage layer (e.g., to safeguard against the torn pages problem). This allows the database layer to focus on query processing tasks, and allows the network between the database and storage layer to be used efficiently.

      • Low-latency WAL writing: The use of low-latency, regional log storage allows the system to quickly flush WAL log records in case of a transaction commit operation. As a result, transaction commit is a very fast operation and the system achieves high transaction throughput even in times of peak load.

      • Fast creation of read replica instances: Since the storage service can serve any block in any zone, any number of read replica instances from the database layer can attach to the storage service and process queries without needing a “private” copy of the database. The creation of a read replica instance is very fast as data can be incrementally loaded from the storage layer on demand — there’s no need to stream a complete copy of the database to a replica instance before starting query processing.

      • Fast restart recovery: Since the log processing service continuously replays WAL log records during online operations, the amount of write-ahead log that needs to be processed during restart recovery is minimal. As a consequence, system restarts are accelerated significantly (because WAL-related recovery work is kept to a minimum).

      • Storage-layer backups: Backup operations are completely handled by the storage service, and do not impact the performance and resources of the database layer.

      Life of a write operation

      Let’s further explore the design of the system by tracing the path of a database modification operation (Figure 3). It begins with a client issuing, for example, a SQL INSERT statement over its TCP connection to the primary instance on the database layer. The primary instance processes the statement (updating its data and index structures in memory) and prepares a WAL record that captures the semantics of the update operation. Upon transaction commit, this log record is first synchronously saved to the low-latency regional log storage, and then asynchronously picked up by the log processing service in the next step.
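
      To ground this flow, the sketch below issues such a modification from the client side using the standard psycopg2 PostgreSQL driver; the connection parameters and the orders table are hypothetical, and nothing AlloyDB-specific appears in the code, since AlloyDB accepts ordinary PostgreSQL connections.

      import psycopg2

      # Hypothetical connection details for an AlloyDB primary instance.
      conn = psycopg2.connect(host="10.0.0.5", dbname="appdb",
                              user="app", password="secret")

      with conn, conn.cursor() as cur:
          # The INSERT is processed in memory on the primary; WAL records
          # describing the change are prepared but not yet durable.
          cur.execute(
              "INSERT INTO orders (customer_id, total) VALUES (%s, %s)",
              (42, 99.95),
          )
      # Exiting the `with conn` block commits. Commit is the point at which
      # the WAL records are synchronously flushed to the low-latency regional
      # log store; block materialization happens asynchronously in the LPS.

      conn.close()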

      Note that the storage layer is intentionally decomposed into separate components, each optimized for one of the storage layer’s tasks — log storage, log processing, and block storage. To reduce transaction commit latency, it is important to durably store log records as quickly as possible. Because WAL writing is an append-only operation, AlloyDB optimizes specifically for this use case with a high-performance, low-latency storage solution. In the second phase, WAL records need to be processed by applying them to the previous version of the block they refer to. To do this, the storage layer’s LPS subsystem performs random block lookups and applies PostgreSQL’s redo-processing logic in a high-performance, scalable way.

      To ensure regional durability for the materialized blocks, multiple instances of the log processing service (LPS) run in each of the zones of the region. Every log record has to be processed and the resulting buffers need to be durably stored in a sharded, regional block storage (see below) to eventually remove the log record from regional log storage.

      Figure 3: Processing of a write operation in AlloyDB

      Life of a read operation

      Similarly, reads begin with a SQL query that’s sent to a database server; this can either be the primary instance or one of the (potentially many) replica instances used for read-only query processing (both of these paths are visualized in Figure 4). The database server performs the same query-parsing, planning and processing as a conventional PostgreSQL system. If all the required blocks are present in its memory-resident buffer cache, there’s no need for the database to interact with the storage layer at all. To allow for very fast query processing, even in cases where the working set does not fit into the buffer cache, AlloyDB integrates an ultra-fast block cache directly into the database layer. This cache extends the capacity of the buffer cache significantly, thereby further accelerating the system in those cases.

      However, if a block is missing in both of the caches, a corresponding block fetch request is sent to the storage layer. Apart from the block number to retrieve, this request specifies a log-sequence number (LSN) at which to read the data. The use of a specific LSN here ensures that the database server always sees a consistent state during query processing. This is particularly important when evicting blocks from the PostgreSQL buffer cache and subsequently reloading them, or when traversing complex, multi-block index structures like B-trees that might be (structurally) modified concurrently.

      Figure 4: Processing of a read operation in AlloyDB
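
      The following toy model (purely illustrative; this is not AlloyDB’s implementation) captures why reads carry an LSN: a multi-versioned block store answers a read at LSN n with the newest version at or below n, so writers at higher LSNs never leak into a reader’s snapshot.

      class VersionedBlock:
          """Toy model of a multi-versioned database block."""

          def __init__(self):
              self.versions = []  # (lsn, payload), appended in increasing LSN order

          def write(self, lsn, payload):
              self.versions.append((lsn, payload))

          def read_at(self, lsn):
              # Return the newest version at or below the requested LSN.
              for version_lsn, payload in reversed(self.versions):
                  if version_lsn <= lsn:
                      return payload
              raise LookupError("no version at or below this LSN")

      blk = VersionedBlock()
      blk.write(100, "v1")
      blk.write(120, "v2")
      assert blk.read_at(110) == "v1"  # a reader at LSN 110 must not see v2
      assert blk.read_at(125) == "v2"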

      On the storage layer, the log processing service is also responsible for serving the block fetch requests. Every LPS has its own instance of the PostgreSQL buffer cache — if the requested block is already in the LPS buffer cache, it can be returned to the database layer immediately without any I/O operation. If the requested block is not present in the cache, the LPS retrieves the block from the sharded, regional storage and sends it back to the database layer. The log processing service must also do some bookkeeping to track which blocks have log records that have not been processed. When a request for such a block arrives (an event we expect to be rare since the database layer only requests blocks that have been evicted from cache and then referenced), the read request must be stalled until redo processing for that log record has been completed. Consequently, to avoid such stalls, it is very important that WAL processing on the LPS layer is efficient and scalable, so it can handle even the most demanding enterprise workloads. We discuss this in more detail in the next section.

      Storage layer elasticity

      So far, we’ve discussed the log processing service as a single process (in each zone). For demanding enterprise workloads, though, having only a single LPS process potentially creates a scalability problem, as the LPS needs both to continuously apply WAL records and to serve read requests from the primary and multiple replica instances.

      To address this problem, database persistence is horizontally partitioned into groups of blocks called shards. Both shards and LPS resources scale horizontally and independently.

      Every shard is assigned to exactly one LPS at any time, but each LPS can handle multiple shards. The shard-to-LPS mapping is dynamic, allowing the storage layer to elastically respond to increased access patterns by scaling the number of LPS resources and reassigning shards. This not only allows the storage layer to scale throughput, but also to avoid hot spots.

      Let’s consider two examples here. In the first case, the overall system load increases, and virtually all shards receive more requests than before. The storage layer can then increase the number of LPS instances, e.g., by doubling them. The newly created log processing server instances offload the existing instances by taking over some of their shards. As this shard reassignment does not involve any data copying or other expensive operations, it is extremely fast and invisible to the database layer.

      Another example where shard reassignment is very helpful is when a small set of shards suddenly becomes very popular (e.g., information about a certain product family stored in the database is requested frequently after a Super Bowl commercial). Again, the storage layer can dynamically react: in the most extreme case, by assigning each of the shards observing the workload spike to a dedicated LPS instance that exclusively handles that shard’s load. Consequently, with re-sharding and LPS elasticity in place, the system can provide high performance and throughput even during workload spikes, and also reduces its resource footprint when the workload decreases again. For both the database layer and the end user, this dynamic resizing and storage layer elasticity is completely automatic and requires no user action.

      Figure 5: Dynamic mapping of shards to LPS instances allows for load balancing and LPS elasticity

      Storage layer replication and recovery

      The goal of AlloyDB is to provide data durability and high system availability even in case of zonal failures, e.g., in case of a power outage or a fire in a data center. To this end, the storage layer of every AlloyDB instance is distributed across three zones. Each zone has a complete copy of the database state, which is continuously updated by applying WAL records from the low-latency, regional log storage system discussed above.

      Figure 6: The fully replicated, fully sharded, multi-zone database state

      Figure 6 shows the full system across three zones with multiple log-processing servers (LPS) and potentially multiple shards per server. Note that a copy of each shard is available in each zone.

      With this architecture, block lookup operations can be performed with minimal overhead. Each zone has its own copy of the complete database state, so block lookup operations by the database layer don't need to cross zonal boundaries. Moreover, the storage layer continuously applies WAL records in all zones and the database layer provides the target version LSN for each block it requests (see above), so there is no need to establish a read quorum during lookup operations. 

      In the event that an entire zone becomes unavailable, the storage layer can replace the failed zone by integrating a new zone from the same region, and populating it with a copy of the complete database state. As visualized in Figure 6, this is done by making sure that a copy of each shard is available in the new zone, and by running the log processing service to continuously update the shards with the latest WAL records. As such, the storage layer internally handles all zone failovers without any orchestration or auxiliary activities of the database layer.

      Besides these built-in capabilities of the storage layer, AlloyDB also integrates both manual and automatically scheduled backup operations to safeguard against application-level or operator failures (like accidentally dropping a table).

      What AlloyDB’s intelligent storage can do for you

      To summarize, AlloyDB for PostgreSQL disaggregates the compute and storage layers of the database, and offloads many database operations to the storage layer through the use of the log processing system. The fully disaggregated architecture even at the storage layer allows it to work as an elastic, distributed cluster that can dynamically adapt to changing workloads, adds failure tolerance, increases availability, and enables cost-efficient read pools that scale read throughput linearly. Offloading also allows for much higher write throughput for the primary instance as it can fully focus on query processing and delegate maintenance tasks to the storage layer. Combined, these aspects of AlloyDB’s intelligent, database-aware storage layer contribute to the exceptional performance and availability of AlloyDB.

      To try AlloyDB out for yourself, visit cloud.google.com/alloydb. And be sure to stay tuned for our next post on AlloyDB’s Columnar Engine. 


      The AlloyDB technical innovations described in this and subsequent posts would not have been possible without the exceptional contributions of our engineering team.

    • Introducing AlloyDB for PostgreSQL: Free yourself from expensive, legacy databases Wed, 11 May 2022 19:15:00 -0000

      Enterprises are struggling to free themselves from legacy database systems, and need an alternative option to modernize their applications. Today at Google I/O, we’re thrilled to announce the preview of AlloyDB for PostgreSQL, a fully-managed, PostgreSQL-compatible database service that provides a powerful option for modernizing your most demanding enterprise database workloads. 

      Compared with standard PostgreSQL, in our performance tests, AlloyDB was more than four times faster for transactional workloads, and up to 100 times faster for analytical queries. AlloyDB was also two times faster for transactional workloads than Amazon’s comparable service. This makes AlloyDB a powerful new modernization option for transitioning off of legacy databases.

      As organizations modernize their database estates in the cloud, many struggle to eliminate their dependency on legacy database engines. In particular, enterprise customers are looking to standardize on open systems such as PostgreSQL to eliminate expensive, unfriendly licensing and the vendor lock-in that comes with legacy products. However, running and replatforming business-critical workloads onto an open source database can be daunting: teams often struggle with performance tuning, disruptions caused by vacuuming, and managing application availability. AlloyDB combines the best of Google’s scale-out compute and storage, industry-leading availability, security, and AI/ML-powered management with full PostgreSQL compatibility, delivering the performance, scalability, manageability, and reliability that enterprises expect for their mission-critical applications.

      As noted by Carl Olofson, Research Vice President, Data Management Software, IDC, “databases are increasingly shifting into the cloud and we expect this trend to continue as more companies digitally transform their businesses. With AlloyDB, Google Cloud offers large enterprises a big leap forward, helping companies to have all the advantages of PostgreSQL, with the promise of improved speed and functionality, and predictable and transparent pricing.”

      AlloyDB is the next major milestone in our journey to support customers' heterogeneous migrations. For example, we recently added Oracle-to-PostgreSQL schema conversion and data replication capabilities to our Database Migration Service, while our new Database Migration Program helps you accelerate your move to the cloud with tooling and incentive funding. 

      “Developers have many choices for building, innovating and migrating their applications. AlloyDB provides us with a compelling relational database option with full PostgreSQL compatibility, great performance, availability and cloud integration. We are really excited to co-innovate with Google and can now benefit from enterprise grade features while cost-effectively modernizing from legacy, proprietary databases."—Bala Natarajan, Sr. Director, Data Infrastructure and Cloud Engineering, PayPal 

      Let’s dive into what makes AlloyDB unique

      With AlloyDB, we’re tapping into decades of experience designing and managing some of the world’s most scalable and available database services, bringing the best of Google to the PostgreSQL ecosystem. 

      At AlloyDB’s core is an intelligent, database-optimized storage service built specifically for PostgreSQL. AlloyDB disaggregates compute and storage at every layer of the stack, using the same infrastructure building blocks that power large-scale Google services such as YouTube, Search, Maps, and Gmail. This unique technology allows it to scale seamlessly while offering predictable performance.

      Additional investments in analytical acceleration, embedded AI/ML, and automatic tiering of data mean that AlloyDB is ready to handle any workload you throw at it, with minimal management overhead.

      Finally, we do all this while maintaining full compatibility with PostgreSQL 14, the latest version of the advanced open source database, so you can reuse your existing development skills and tools, and migrate your existing PostgreSQL applications with no code changes, benefitting from the entire PostgreSQL ecosystem. Furthermore, by using PostgreSQL as the foundation of AlloyDB, we’re continuing our commitment to openness while delivering differentiated value to our customers.

      “We have been so delighted to try out the new AlloyDB for PostgreSQL service. With AlloyDB, we have significantly increased throughput, with no application changes to our PostgreSQL workloads. And since it's a managed service, our teams can spend less time on database operations, and more time on value added tasks.”—Sofian Hadiwijaya, CTO and Co-Founder, Warung Pintar

      With AlloyDB you can modernize your existing applications with:

      1. Superior performance and scale
      AlloyDB delivers superior performance and scale for your most demanding commercial-grade workloads. AlloyDB is four times faster than standard PostgreSQL and two times faster than Amazon’s comparable PostgreSQL-compatible service for transactional workloads. Multiple layers of caching, automatically tiered based on workload patterns, provide customers best-in-class price/performance.

      2. Industry-leading availability
      AlloyDB provides a high-availability SLA of 99.99% inclusive of maintenance. AlloyDB automatically detects and recovers from most database failures within seconds, independent of database size and load. AlloyDB’s architecture also supports non-disruptive instance resizing and database maintenance. The primary instance can resume normal operations in seconds, while replica pool updates are fully transparent to users. This ensures that customers have a highly reliable, continuously available database for their mission-critical workloads.

      “We are excited about the new PostgreSQL-compatible database. AlloyDB will bring more scalability and availability with no application changes. As we run our e-commerce platform and its availability is important, we are specially expecting AlloyDB to minimize the maintenance downtime.”—Ryuzo Yamamoto, Software Engineer, Mercari (​​Souzoh, Inc.)

      3. Real-time business insights 
      AlloyDB delivers up to 100 times faster analytical queries than standard PostgreSQL. This is enabled by a vectorized columnar accelerator that stores data in memory in an optimized columnar format for faster scans and aggregations. This makes AlloyDB a great fit for business intelligence, reporting, and hybrid transactional and analytical processing (HTAP) workloads. And even better, the accelerator is auto-populated, so you can improve analytical performance with the click of a button.
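
      As a hypothetical example of the query shape this accelerates (the schema is invented), a typical reporting query scans and aggregates a large table. No SQL or driver changes are needed to benefit, since the accelerator is applied transparently:

      import psycopg2

      conn = psycopg2.connect(host="10.0.0.5", dbname="appdb",
                              user="app", password="secret")  # hypothetical
      with conn.cursor() as cur:
          # A scan-heavy aggregation: exactly the filter-and-aggregate work
          # an in-memory columnar format is designed to speed up.
          cur.execute("""
              SELECT region, COUNT(*) AS orders, SUM(total) AS revenue
              FROM orders
              WHERE order_date >= DATE '2022-01-01'
              GROUP BY region
              ORDER BY revenue DESC
          """)
          for region, orders, revenue in cur.fetchall():
              print(region, orders, revenue)
      conn.close()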

      “At PLAID, we are developing KARTE, a customer experience platform. It provides advanced real-time analytics capabilities for vast amounts of behavioral data to discover deep insights and create an environment for communicating with customers. AlloyDB is fully compatible with PostgreSQL and can transparently extend column-oriented processing. We think it's a new powerful option with a unique technical approach that enables system designs to integrate isolated OLTP, OLAP, and HTAP workloads with minimal investment in new expertise. We look forward to bringing more performance, scalability, and extensibility to our analytics capabilities by enhancing data integration with Google Cloud's other powerful database services in the future.”—Takuya Ogawa, Lead Product Engineer, PLAID

      4. Predictable, transparent pricing
      AlloyDB makes keeping costs in check easier than ever. Pricing is transparent and predictable, with no expensive, proprietary licensing and no opaque I/O charges. Storage is automatically provisioned and customers are only charged for what they use, with no additional storage costs for read replicas. A free ultra-fast cache, automatically provisioned in addition to instance memory, allows you to maximize price/performance.

      5. ML-assisted management and insights 
      Like many managed database services, AlloyDB automatically handles database patching, backups, scaling and replication for you. But it goes several steps further by using adaptive algorithms and machine learning for PostgreSQL vacuum management, storage and memory management, data tiering, and analytics acceleration. It learns about your workload and intelligently organizes your data across memory, an ultra-fast secondary cache, and durable storage. These automated capabilities simplify management for DBAs and developers. AlloyDB also empowers customers to better leverage machine learning in their applications. Built-in integration with Vertex AI, Google Cloud’s artificial intelligence platform, allows users to call models directly within a query or transaction. That means high throughput, low-latency, and augmented insights, without having to write any additional application code.

      Get started with AlloyDB

      A modern database strategy plays a critical role in developing great applications faster and delivering new experiences to your customers. The AlloyDB launch is an exciting milestone for Google Cloud databases, and we’re thrilled to see how you use it to drive innovation across your organization and regain control and freedom of your database workloads.

      In this video, Thomas Kurian, CEO of Google Cloud, introduces the new fully managed, relational database service from Google Cloud that’s fully compatible with PostgreSQL. AlloyDB for PostgreSQL combines the best of Google’s scale-out compute and storage, industry-leading availability, and AI/ML-powered management with full PostgreSQL compatibility. Our performance tests show that AlloyDB offers superior performance, as well as the scalability, manageability, and reliability benefits that enterprises expect for their mission-critical applications. AlloyDB provides organizations a powerful option for transitioning off of legacy databases.

      To learn more about the technology innovations behind AlloyDB, check out this deep dive into its intelligent storage system. Then, visit cloud.google.com/alloydb to get started and create your first cluster. You can also review the demos and launch announcements from Google I/O 2022.

    • Google Cloud unveils world’s largest publicly available ML hub with Cloud TPU v4, 90% carbon-free energy Wed, 11 May 2022 19:15:00 -0000

      At Google, the state-of-the-art capabilities you see in our products such as Search and YouTube are made possible by Tensor Processing Units (TPUs), our custom machine learning (ML) accelerators. We offer these accelerators to Google Cloud customers as Cloud TPUs. Customer demand for ML capacity, performance, and scale continues to increase at an unprecedented rate. To support the next generation of fundamental advances in artificial intelligence (AI), today we announced Google Cloud's machine learning cluster with Cloud TPU v4 Pods in Preview — one of the fastest, most efficient, and most sustainable ML infrastructure hubs in the world.

      Powered by Cloud TPU v4 Pods, Google Cloud’s ML cluster enables researchers and developers to make breakthroughs at the forefront of AI, allowing them to train increasingly sophisticated models to power workloads such as large-scale natural language processing (NLP), recommendation systems, and computer vision algorithms. At 9 exaflops of peak aggregate performance, we believe our cluster of Cloud TPU v4 Pods is the world's largest publicly available ML hub in terms of cumulative computing power, while operating at 90% carbon-free energy.  

      "Based on our recent survey of 2000 IT decision makers, we found that inadequate infrastructure capabilities are often the underlying cause of AI projects failing. To address the growing importance for purpose-built AI infrastructure for enterprises, Google launched its new machine learning cluster in Oklahoma with nine exaflops of aggregated compute. We believe that this is the largest publicly available ML hub with 90% of the operation reported to be powered by carbon free energy. This demonstrates Google's ongoing commitment to innovating in AI infrastructure with sustainability in mind." —Matt Eastwood, Senior Vice President, Research, IDC

      Pushing the boundaries of what’s possible

      Building on the announcement of Cloud TPU v4 at Google I/O 2021, we granted early access to Cloud TPU v4 Pods to several top AI research teams, including Cohere, LG AI Research, Meta AI, and Salesforce Research. Researchers liked the performance and scalability that TPU v4 provides with its fast interconnect and optimized software stack, the ability to set up their own interactive development environment with our new TPU VM architecture, and the flexibility to use their preferred frameworks, including JAX, PyTorch, or TensorFlow. These characteristics allow researchers to push the boundaries of AI, training large-scale, state-of-the-art ML models with high price-performance and carbon efficiency.

      In addition, TPU v4 has enabled breakthroughs at Google Research in the areas of language understanding, computer vision, speech recognition, and much more, including the recently announced Pathways Language Model (PaLM) trained across two TPU v4 Pods.

      “In order to make advanced AI hardware more accessible, a few years ago we launched the TPU Research Cloud (TRC) program that has provided access at no charge to TPUs to thousands of ML enthusiasts around the world. They have published hundreds of papers and open-source GitHub libraries on topics ranging from ‘Writing Persian poetry with AI’ to ‘Discriminating between sleep and exercise-induced fatigue using computer vision and behavioral genetics’. The Cloud TPU v4 launch is a major milestone for both Google Research and our TRC program, and we are very excited about our long-term collaboration with ML developers around the world to use AI for good.”—Jeff Dean, SVP, Google Research and AI

      Sustainable ML breakthroughs

      The fact that this research is powered predominantly by carbon-free energy makes the Google Cloud ML cluster all the more remarkable. As part of Google’s commitment to sustainability, we’ve been matching 100% of our data centers’ and cloud regions’ annual energy consumption with renewable energy purchases since 2017. By 2030, our goal is to run our entire business on carbon-free energy (CFE) every hour of every day. Google’s Oklahoma data center, where the ML cluster is located, is well on its way to achieving this goal, operating at 90% carbon-free energy on an hourly basis within the same grid. 

      In addition to the direct clean energy supply, the data center has a Power Usage Efficiency (PUE)1 rating of 1.10, making it one of the most energy-efficient data centers in the world. Finally, the TPU v4 chip itself is highly energy-efficient, with about 3x the peak FLOPs per watt (at maximum power) of TPU v3. With energy-efficient ML-specific hardware, in a highly efficient data center, supplied by exceptionally clean power, Cloud TPU v4 provides three key best practices that can help significantly reduce energy use and carbon emissions.

      Breathtaking scale and price-performance

      In addition to sustainability, in our work with leading ML teams we have observed two other pain points: scale and price-performance. Our ML cluster in Oklahoma offers the capacity that researchers need to train their models, at compelling price-performance, on the cleanest cloud in the industry. Cloud TPU v4 is central to solving these challenges. 

      • Scale: Each Cloud TPU v4 Pod consists of 4096 chips connected together via an ultra-fast interconnect network with the equivalent of an industry-leading 6 terabits per second (Tbps) of bandwidth per host, enabling rapid training for the largest models.

      • Price-performance: Each Cloud TPU v4 chip has ~2.2x more peak FLOPs than Cloud TPU v3, for ~1.4x more peak FLOPs per dollar. Cloud TPU v4 also achieves exceptionally high utilization of these FLOPs for training ML models at scale, up through thousands of chips. While many quote peak FLOPs as the basis for comparing systems, it is actually sustained FLOPs at scale that determine model training efficiency, and Cloud TPU v4’s high FLOPs utilization (significantly better than other systems, thanks to high network bandwidth and compiler optimizations) helps yield shorter training times and better cost efficiency.

      Table 1: Cloud TPU v4 pods deliver state-of-the-art performance through significant advancements in FLOPs, interconnect, and energy efficiency.

      Cloud TPU v4 Pod slices are available in configurations ranging from four chips (one TPU VM) to thousands of chips. While slices of previous-generation TPUs smaller than a full Pod lacked torus links (“wraparound connections”), all Cloud TPU v4 Pod slices of at least 64 chips have torus links on all three dimensions, providing higher bandwidth for collective communication operations.

      Cloud TPU v4 also enables accessing a full 32 GiB of memory from a single device, up from 16 GiB in TPU v3, and offers two times faster embedding acceleration, helping to improve performance for training large-scale recommendation models. 

      Pricing

      Access to Cloud TPU v4 Pods comes in evaluation (on-demand), preemptible, and committed use discount (CUD) options. Please refer to this page for more details.

      Get started today

      We are excited to offer the state-of-the-art ML infrastructure that powers Google services to all of our users, and look forward to seeing how the community leverages Cloud TPU v4's combination of industry-leading scale, performance, sustainability, and cost efficiency to deliver the next wave of ML-powered breakthroughs. 


      Ready to start using Cloud TPU v4 Pods for your AI workloads? Please fill in this form.


      Acknowledgements
      The authors would like to thank the Cloud TPU engineering and product teams for making this launch possible. We also want to thank James Bradbury, Software Engineer; Vaibhav Singh, Outbound Product Manager; and Aarush Selvan, Product Manager, for their contributions to this blog post.

      1. We report a comprehensive trailing twelve-month (TTM) PUE in all seasons, including all sources of overhead.

    • Google Cloud at I/O: Everything you need to know Wed, 11 May 2022 19:15:00 -0000

      We love this time of year. This week is Google I/O, our largest developer conference, where developer communities from around the world come together to learn, catch up, and have fun. Google Cloud and Google Workspace had a big presence at the show, talking about our commitment to building intuitive and helpful developer experiences to help you innovate freely and quickly. We do the heavy lifting, embedding the expertise from years of Google research in areas like AI/ML and security, so you can easily build secure and intelligent solutions for your customers.

      So, what’s happening at I/O this year? 

      Let’s start with the keynotes… 

      Google I/O keynote 

      Google and Alphabet CEO Sundar Pichai kicked off Day 1 of I/O with a powerhouse keynote highlighting recent breakthroughs in machine learning, including one of the fastest, most efficient, and most sustainable ML infrastructure hubs in the world. Google Cloud's machine learning cluster with Cloud TPU v4 pods (in Preview) allows researchers and developers to make AI breakthroughs by training larger and more complex models faster, to power workloads like large-scale natural language processing (NLP), recommendation systems, and computer vision. With eight TPU v4 pods in a single data center, generating 9 exaflops of peak performance, we believe this system is the world's largest publicly available ML hub in terms of cumulative computing power, while operating at 90% carbon-free energy. Read more about the ML hub with Cloud TPU v4 here.

      “Early access to TPU v4 has enabled us to achieve breakthroughs in conversational AI programming with our CodeGen, a 16-billion parameter auto-regressive language model that turns simple English prompts into executable code.” —Erik Nijkamp, Research Scientist, Salesforce

      “…we saw a 70% improvement in training time for our ‘extremely large’ model when moving from TPU v3 to TPU v4... The exceptionally low carbon footprint of Cloud TPU v4 Pods was another key factor.…”—Aidan Gomez, CEO and Co-Founder, Cohere

      In the keynote, Sundar also announced new user-focused, AI-enabled features in Google Workspace, designed to help people thrive in the hybrid workplace. New advancements in NLP enable summaries in Spaces, helping users catch up on missed conversations with a helpful digest. Automated meeting transcription for Google Meet allows users who didn’t attend a meeting to stay in the loop, or attendees to easily reference the discussion at a later time. Users can also now leverage portrait restore, which automatically improves video image quality — even on devices with lower-quality webcams. And they can filter out the reverberation in large spaces with hard surfaces, getting “conference-room-quality” audio whether they are in their basement, kitchen, or garage. These new features deliver high-quality experiences, allowing Google Workspace users to benefit from our AI leadership.

      Developer keynote

      Next up, we heard from Jeanine Banks, Google Vice President of Developer Experiences and DevRel, and a number of product teams who led us through a flurry of exciting new updates about everything from Android to Flutter to Cloud. On the Google Cloud front, we announced the preview of Cloud Run jobs, which can reduce the time developers spend performing administrative tasks such as database migration, managing scheduled jobs like nightly reports, or doing batch data transformation. With Cloud Run jobs, you can execute your code on the highly scalable, fully managed Cloud Run platform, but only pay when your jobs are executing — and without having to worry about managing infrastructure. Learn more about Cloud Run jobs here.

      Then, we announced the preview of AlloyDB for PostgreSQL, a new fully managed, relational database service that gives enterprises the performance, availability, and ease of management they need to migrate from their expensive legacy database systems and onto Google Cloud. AlloyDB combines proven, disaggregated storage and compute that powers our most popular, globally available products such as Google Maps, YouTube, Search, and Ads — with PostgreSQL, an open source database engine beloved by developers.

      Our performance tests show that AlloyDB is four times faster for transaction processing and up to 100 times faster for analytical queries than standard PostgreSQL. It’s also two times faster than AWS’ comparable PostgreSQL-compatible service for transactional workloads. AlloyDB’s fully-managed database operations and ML-based management systems can relieve administrators and developers from daunting database management tasks. Of course, AlloyDB is fully PostgreSQL-compatible, meaning that developers can reuse their existing development skills and tools. It also offers an impressive 99.99% SLA inclusive of maintenance, and no complex licensing or I/O charges. You can learn more about AlloyDB for PostgreSQL here.

      “Developers have many choices for building, innovating and migrating their applications. AlloyDB provides us with a compelling relational database option with full PostgreSQL compatibility, great performance, availability, and cloud integration. We are really excited to co-innovate with Google and can now benefit from enterprise grade features while cost-effectively modernizing from legacy, proprietary databases."—Bala Natarajan, Sr. Director, Data Infrastructure and Cloud Engineering at PayPal

      Cloud keynote - “The cloud built for developers” 

      Moving on to the Cloud keynote, Google Cloud’s very own Aparna Sinha, Director of Product Management, and Google Workspace’s Matthew Izatt, Product Lead, gave the I/O audience exciting cloud updates. Aparna reiterated the benefits of Cloud Run jobs and AlloyDB, while showcasing how our services integrate to give you a full stack specifically tailored for backend, web, mobile, and data analytics applications. These stacks also natively embed key security and AI/ML features for simplicity.

      Specifically, with build integrity, a new feature in Cloud Build, you get out-of-the-box build provenance and “Built by Cloud Build” attestations, including details like the images generated, the input sources, the build arguments, and the build time, helping you achieve up to SLSA Level 2 assurance. Next, you can use Binary Authorization to help ensure that only verified builds with the right attestations are deployed to production. You can get the same results as the experts — without having to be a security expert yourself.

      Aparna also announced the preview of Network Analyzer, showing how developers can troubleshoot and isolate root causes of complex service disruptions quickly and easily. The new Network Analyzer module in Network Intelligence Center can proactively detect network failures to prevent downtime caused by accidental misconfiguration, over-utilization, and suboptimal routes. Network Analyzer is available for services like Compute Engine, Google Kubernetes Engine (GKE), Cloud SQL, and more. You can visit the Network Analyzer page to learn more. 

      Something that really got the developer audience excited was the announcement of the preview of Immersive Stream for XR, which lets you render eXtended Reality experiences using powerful Google Cloud GPUs and stream them to mobile devices around the world. Immersive Stream for XR streamlines the process of creating, maintaining, and scaling high-quality XR. In fact, XR content delivered using Immersive Stream for XR works on nearly every mobile device regardless of model, year, or operating system. Also, your users can enjoy these immersive experiences simply by clicking a link or scanning a QR code.

      “We know that our new and existing customers expect unique and innovative campaigns for two of the most unique and innovative vehicles in our brand’s history, and Google Cloud helped us create something very special to share with them.”—Albi Pagenstert, Head of Brand Communications and Strategy, BMW of North America 

      To learn more, visit xr.withgoogle.com, and check out this video to see for yourself!

      And finally, Matthew brought it all home, highlighting the incredible innovation coming from Google Workspace. He detailed how we are making it easier for developers to extend and customize the suite, and to simplify integration with existing tools. For example, Google Workspace Add-ons allow you to build applications using your preferred stack and languages; you just build once, and your application is available to use across Google Workspace apps such as Gmail, Google Calendar, Drive and Docs. Matthew also shared how we are improving the development experience by allowing you to easily connect DevOps tools like PagerDuty to the Google Workspace platform. Finally, he noted the critical role that Google Workspace Marketplace can play in increasing the growth and engagement of your application. If you’re interested in learning about how we’re using machine learning to help make people’s work day more productive and impactful, here’s where you can find all of this week’s Workspace news.

      Sessions and workshops

      Whew… that was a lot of cloud updates in three keynotes! But wait… there’s more!

      Google Cloud also had 14 cloud breakout sessions and 5 workshops at I/O, covering loads of different topics. Here’s the full list for you, all available on demand:

      Sessions

      1. An introduction to MLOps with TFX

      2. Asynchronous operations in your UI using Workflows and Firestore

      3. Auto alerts for Firebase users with Functions, Logging, and BigQuery

      4. Conversational AI for business messaging

      5. Develop for Google Cloud Platform faster with Cloud Code

      6. Extending Google Workspace with AppSheet’s no-code platform and Apps Script

      7. Fraudfinder: A comprehensive solution for real data science problems

      8. From colab to Cloud in five steps

      9. Learn how to enable shared experiences across platforms

      10. Learn to refactor Cloud applications in Go 1.18 with Generics

      11. Modern Angular deployment with Google Cloud

      12. Run your jobs on serverless

      13. The future of app development with cloud databases

      14. What's new in the world of Google Chat apps

      Workshops

      1. Apply responsible AI principles when building remote sensing datasets

      2. Build an event-driven orchestration with Eventarc and Workflows

      3. Building AppSheet apps with the new Apps Script connector

      4. Faster model training and experimentation with Vertex AI

      5. Spring Native on GCP - what, when, and why?

      And finally, what would I/O be without some massively fun interactive experiences? Take our cloud island at I/O Adventure, featuring custom interactive demos and sandboxes. There, attendees can explore content, chat with Googlers, and earn some really cool swag.

      So that’s a wrap on Google Cloud announcements at I/O. We’ll have lots more exciting announcements in the next few months that will make your developer experience even simpler and more intuitive. In the meantime, join our developer community, Google Cloud Innovators, where you’ll make lots of awesome new friends. And be sure to register for Google Cloud Next '22 in October. We can’t wait to see you again!

    • Introducing new AI to help people thrive in hybrid work Wed, 11 May 2022 17:30:00 -0000

      We’ve been using machine learning in Google Workspace for some years to help make people’s work day more productive and impactful. Today, we’re announcing new features in Google Workspace that tap into our industry-leading AI to help people thrive and get more done in a hybrid work world.

      We hear from customers—and observe in many of our own teams—that staying on top of the vast amount of information flowing across desks and devices can be a challenge. Information overload isn’t a new phenomenon, but many of our customers say that hybrid work has increased the sheer volume of emails, chats, and meetings for their organizations. Our latest AI innovations are designed to help employees bring focus to what matters, collaborate securely, and strengthen human connections across the ways and places work happens.

      Helping people focus on what’s important

      Using our advancements in natural language processing, we recently introduced automated summaries in Google Docs. In the coming months, we’re extending built-in summaries to Spaces to provide a helpful digest of conversations. Summaries allow you to catch up quickly and easily on what you’ve missed from conversations in Spaces.

      Summaries in Spaces help you catch up quickly on conversations

      To help boost participation, we’re also introducing automated meeting transcription for Google Meet, allowing those who didn’t attend a meeting to stay in the loop, or for attendees to easily reference the discussion at a later time. We're releasing automated transcription later this year, and meeting summarization next year.

      Helping to ensure everyone in the team can be seen and heard 

      We’re using machine learning to make the meeting experience in Google Meet more immersive and meaningful. In turn, these enhancements can help strengthen human connections in a hybrid work world.

      To make it easier for people to connect and share rich content in Google Meet, we’re delivering enhancements to image, sound, and content sharing capabilities later this year. Portrait restore uses Google’s AI to help improve your video quality by addressing issues caused by low light, low quality webcams, or poor network connectivity. This processing automatically happens in the cloud to enhance video quality without impacting device performance.

      Portrait restore improves video quality using Google AI

      Historically, lighting has been a challenge in video conferencing but it can make a huge difference in how you show up on screen. Portrait light uses machine learning to simulate studio-quality lighting in your video feed, and you can even adjust the light position and brightness so you’re seen the way you want to be.

      Portrait light brings studio quality lighting effects to Google Meet

      De-reverberation, meanwhile, filters out the echoes in spaces with hard surfaces, giving you conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.

      Live sharing in Google Meet can make hybrid meetings more interactive by synchronizing media and content across participants. Users will be able to share controls and interact directly within the meeting, whether it’s watching an icebreaker video from YouTube or sharing a playlist. Our partners and developers can use our live sharing APIs today to start integrating Meet into their apps.

      Helping to ensure hybrid collaboration is secure by design

      Collaboration—across time zones, physical locations, and documents—must happen on a secure foundation. Google Workspace is built with a zero-trust approach and comes with enterprise-grade access management, data protection, encryption, and endpoint protections built in. We keep more people safe online than anyone else in the world, and Gmail blocks more than 99.9% of spam and phishing messages from ever reaching users’ inboxes.

      Our systems constantly learn from each attempted attack against the billions of users who rely on our products. These insights enable us to anticipate and thwart new attacks by identifying emerging patterns and entry points. Attackers are creative and determined—and with the recent increase in remote and multi-location collaboration, there’s been a trend toward new attack patterns within shared docs. That’s why later this year we’re scaling the phishing and malware protections that guard Gmail to Google Slides, Docs, and Sheets. If a Doc, Sheet, or Slide you’re about to access contains phishing links or malware, you’ll automatically be alerted and guided back to safety.

      Investing in people-first collaboration for a hybrid future

      At Google, we think AI can meaningfully improve people’s lives and that everyone should have access to its benefits. As hybrid work evolves, we’ll continue to infuse intelligent capabilities across Google Workspace—in the apps familiar to billions of users—so that it’s easier for employees to bring focus to their top priorities, fully participate, and collaborate from anywhere. Unlocking new ways for people to achieve more together is crucial if teams and organizations are going to thrive in the new world of work.

    • Extending BigQuery Functions beyond SQL with Remote Functions, now in preview Wed, 11 May 2022 16:00:00 -0000

      Today we are announcing the Preview of BigQuery Remote Functions. Remote Functions are user-defined functions (UDFs) that let you extend BigQuery SQL with your own custom code, written and hosted in Cloud Functions, Google Cloud’s scalable pay-as-you-go functions-as-a-service offering. A remote UDF accepts columns from BigQuery as input, performs actions on that input using a Cloud Function, and returns the result of those actions as a value in the query result. With Remote Functions, you can now write custom SQL functions in Node.js, Python, Go, Java, .NET, Ruby, or PHP. This means you can personalize BigQuery for your company and leverage the same management and permission models, all without having to manage a server.


      In what types of situations could you use remote functions?

      Before today, BigQuery customers could create user-defined functions (UDFs) in either SQL or JavaScript that ran entirely within BigQuery. While these functions are performant and fully managed from within BigQuery, customers expressed a desire to extend BigQuery UDFs with their own external code. Here are some examples of what they have asked for:

      • Security and Compliance: Use data encryption and tokenization services from the Google Cloud security ecosystem for external encryption and de-identification. We’ve already started working with key partners like Protegrity and CyberRes Voltage on using these external functions as a mechanism to merge BigQuery into their security platform, which will help our mutual customers address strict compliance controls. 
      • Real Time APIs: Enrich BigQuery data using external APIs to obtain the latest stock price data, weather updates, or geocoding information.
      • Code Migration: Migrate legacy UDFs or other procedural functions written in Node.js, Python, Go, Java, .NET, Ruby or PHP. 
      • Data Science: Encapsulate complex business logic and score BigQuery datasets by calling models hosted in Vertex AI or other Machine Learning platforms.

      Getting Started

      Let’s go through the steps to use a BigQuery remote UDF. 

      Setup the BigQuery Connection:
         1. Create a BigQuery Connection (a sample command follows) 
           a. You may need to enable the BigQuery Connection API
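
      For illustration, a connection can be created directly from the bq CLI; CLOUD_RESOURCE is the connection type used by remote functions, and the connection name here is a placeholder:

      bq mk --connection --location=US --project_id=$PROJECT_ID \
          --connection_type=CLOUD_RESOURCE my-bq-cf-connection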

      Deploy a Cloud Function with your code:
         1. Deploy your Cloud Function
           a. You may need to enable Cloud Functions API
           b. You may need to enable Cloud Build APIs

         2. Grant the BigQuery Connection service account access to the Cloud Function
           a. One way you can find the service account is by using the bq CLI show command

      bq show --location=US --connection $CONNECTION_NAME
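
      With the connection’s service account in hand, one way to grant it permission to invoke the function (the variable names here are placeholders) is with the gcloud CLI:

      gcloud functions add-iam-policy-binding $FUNCTION_NAME \
          --member="serviceAccount:$CONNECTION_SERVICE_ACCOUNT" \
          --role="roles/cloudfunctions.invoker"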

      Define the BigQuery remote UDF: 
         1. Create the remote UDF definition within BigQuery 
           a. One way to find the endpoint name is to use the gcloud CLI functions describe command

      gcloud functions describe $FUNCTION_NAME

      Use the BigQuery remote UDF in SQL:
         1. Write a SQL statement as you would for any UDF (see the sketch below) 
         2. Get your results! 
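
      As a sketch of what that looks like end to end (the table and function names are hypothetical, and the decryption function is defined later in this walkthrough), here is a query issued with the BigQuery Python client:

      from google.cloud import bigquery

      client = bigquery.Client()

      # Hypothetical usage: call the remote UDF exactly like any other
      # SQL function, here decrypting a column in place.
      query = """
          SELECT
            encrypted_column,
            `<project-id>.demo.decryption`(encrypted_column) AS decrypted_value
          FROM `<project-id>.demo.my_table`
      """
      for row in client.query(query).result():
          print(row.decrypted_value)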

      How remote functions can help you with common data tasks

      Let’s take a look at some examples of how using BigQuery with remote UDFs can help accelerate development and enhance data processing and analysis.

      Encryption and Decryption

      As an example, let’s create a simple custom encryption and decryption Cloud Function in Python. 

      The encryption function can receive the data and return an encrypted base64 encoded string. 

      In the same Cloud Function, the decryption function can receive an encrypted base64 encoded string and return the decrypted string. A data engineer would be able to enable this functionality in BigQuery.

      The Cloud Function receives the data and determines which function you want to invoke. The data is received as an HTTP request. The additional userDefinedContext fields allow you to send additional pieces of data to the Cloud Function.
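
      For a rough idea of the contract (the field set is trimmed here, and the values are hypothetical), the exchange looks like this:

      # Approximate shape of the JSON body BigQuery POSTs to the function;
      # "calls" holds one inner list per row, one element per argument.
      sample_request = {
          "userDefinedContext": {"mode": "encryption"},
          "calls": [["some plaintext"], ["another value"]],
      }

      # The function must reply with one entry in "replies" per entry in
      # "calls", in the same order.
      sample_response = {"replies": ["<base64 ciphertext 1>", "<base64 ciphertext 2>"]}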

      import json

      def remote_security(request):
          # Entry point: BigQuery sends one HTTP POST per batch of rows;
          # userDefinedContext tells us which operation to run.
          request_json = request.get_json()
          mode = request_json['userDefinedContext']['mode']
          calls = request_json['calls']
          not_extremely_secure_key = 'not_really_secure'
          if mode == "encryption":
              return encryption(calls, not_extremely_secure_key)
          elif mode == "decryption":
              return decryption(calls, not_extremely_secure_key)
          return json.dumps({"Error in Request": request_json}), 400

      The result is returned in a specific JSON formatted response that is returned to BigQuery to be parsed.

      import base64
      import json

      from Crypto.Cipher import AES  # pycryptodome

      def encryption(calls, not_extremely_secure_key):
          return_value = []
          for call in calls:
              data = call[0].encode('utf-8')
              cipher = AES.new(
                  not_extremely_secure_key.encode('utf-8')[:16],
                  AES.MODE_EAX
              )
              cipher_text = cipher.encrypt(data)
              # Prepend the EAX nonce so decryption can recover it, then
              # base64-encode the result for transport back to BigQuery.
              return_value.append(
                  str(base64.b64encode(cipher.nonce + cipher_text))[2:-1]
              )
          return json.dumps({"replies": return_value})
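
      The decryption counterpart is not shown in the original snippet; a minimal sketch that mirrors the encryption logic above (a 16-byte EAX nonce prepended to the ciphertext) might look like this:

      def decryption(calls, not_extremely_secure_key):
          # Sketch only: split each base64 payload back into the EAX nonce
          # (first 16 bytes) and the ciphertext, then decrypt with the same key.
          return_value = []
          for call in calls:
              raw = base64.b64decode(call[0])
              nonce, cipher_text = raw[:16], raw[16:]
              cipher = AES.new(
                  not_extremely_secure_key.encode('utf-8')[:16],
                  AES.MODE_EAX,
                  nonce=nonce
              )
              return_value.append(cipher.decrypt(cipher_text).decode('utf-8'))
          return json.dumps({"replies": return_value})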

      This Python code is deployed to Cloud Functions, where it waits to be invoked.

      Let’s add the user-defined function to BigQuery so we can invoke it from a SQL statement. The additional user_defined_context is sent to Cloud Functions as extra context in the request payload, so you can map multiple remote functions to one endpoint.

      CREATE OR REPLACE FUNCTION `<project-id>.demo.decryption` (x STRING)
      RETURNS STRING
      REMOTE WITH CONNECTION `<project-id>.us.my-bq-cf-connection`
      OPTIONS (
        endpoint = 'https://us-central1-<project-id>.cloudfunctions.net/remote_security',
        user_defined_context = [("mode","decryption")]
      )

      Once we’ve created our functions, users with the right IAM permissions can use them in SQL on BigQuery.


      If you’re new to Cloud Functions, be aware that the first invocation after a period of inactivity can incur a small delay known as a “cold start.”

      The neat thing is you can call APIs as well, which is how our partners at Protegrity and Voltage enable their platforms to perform encryption and decryption of BigQuery data.

      Calling APIs to enrich your data

      Users such as data analysts can easily use these user-defined functions without needing other tools or moving the data out of BigQuery.

      You can enrich your dataset with many more APIs. For example, you can call the Google Cloud Natural Language API to analyze the sentiment of your text without having to use another tool.

      import json

      from google.cloud import language_v1

      def call_nlp(calls):
          return_value = []
          client = language_v1.LanguageServiceClient()
          for call in calls:
              text = call[0]
              document = language_v1.Document(
                  content=text, type_=language_v1.Document.Type.PLAIN_TEXT
              )
              sentiment = client.analyze_sentiment(
                  request={"document": document}
              ).document_sentiment
              return_value.append(str(sentiment.score))
          return json.dumps({"replies": return_value})

      Once the Cloud Function is deployed and the remote UDF definition is created on BigQuery, you are able to invoke the NLP API and return the data from it for use in your queries.
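
      Putting the pieces together, a hypothetical definition and call for this sentiment function (the project, dataset, connection, and endpoint names are placeholders) could be issued from the Python client:

      from google.cloud import bigquery

      client = bigquery.Client()

      # Hypothetical names throughout: adjust project, dataset, connection,
      # and Cloud Function endpoint to your own deployment.
      client.query("""
          CREATE OR REPLACE FUNCTION `<project-id>.demo.sentiment` (x STRING)
          RETURNS STRING
          REMOTE WITH CONNECTION `<project-id>.us.my-bq-cf-connection`
          OPTIONS (
            endpoint = 'https://us-central1-<project-id>.cloudfunctions.net/call_nlp'
          )
      """).result()

      rows = client.query("""
          SELECT review_text, `<project-id>.demo.sentiment`(review_text) AS score
          FROM `<project-id>.demo.product_reviews`
      """).result()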


      Custom Vertex AI endpoint

      Data scientists can integrate Vertex AI endpoints and other APIs for custom models, all from the SQL console. 

      Remember, remote UDFs are meant for scalar execution.

      You can deploy a model to a Vertex AI endpoint, which is itself just another API, and then call that endpoint from Cloud Functions.

      import json

      from google.cloud import aiplatform
      from google.cloud.aiplatform.gapic.schema import predict
      from google.protobuf import json_format
      from google.protobuf.struct_pb2 import Value

      def predict_classification(calls):
          # Vertex AI endpoint details; project, location, endpoint_id, and
          # client_options are assumed to be configured elsewhere.
          client = aiplatform.gapic.PredictionServiceClient(
              client_options=client_options
          )
          endpoint = client.endpoint_path(
              project=project, location=location, endpoint=endpoint_id
          )
          return_value = []
          # Call the endpoint for each row
          for call in calls:
              content = call[0]
              instance = predict.instance.TextClassificationPredictionInstance(
                  content=content,
              ).to_value()
              instances = [instance]
              parameters = json_format.ParseDict({}, Value())
              response = client.predict(
                  endpoint=endpoint, instances=instances, parameters=parameters
              )
              # Not shown in the original snippet: collect one reply per row
              # so the batch can be returned to BigQuery.
              return_value.append(str(response.predictions))
          return json.dumps({"replies": return_value})

    • Security through collaboration: Building a more secure future with Confidential Computing Tue, 10 May 2022 19:00:00 -0000

      At Google Cloud, we believe that the protection of our customers’ sensitive data is paramount, and encryption is a powerful mechanism to help achieve this goal. For years, we have supported encryption in transit when our customers ingest their data to bring it to the cloud. We’ve also long supported encryption at rest, for all customer content stored in Google Cloud. 

      To complete the full data protection lifecycle, we can protect customer data when it’s processed through our Confidential Computing portfolio. Confidential Computing products from Google Cloud protect data in use by performing computation in a hardware isolated environment that is encrypted with keys managed by the processor and unavailable to the operator. These isolated environments help prevent unauthorized access or modification of applications and data while in use, thereby increasing the security assurances for organizations that manage sensitive and regulated data in public cloud infrastructure. 

      Secure isolation has always been a critical component of our cloud infrastructure; with Confidential Computing, this isolation is cryptographically reinforced. Google Cloud’s Confidential Computing products leverage security components in AMD EPYC™ processors including AMD Secure Encrypted Virtualization (SEV) technology.

      Building trust in Confidential Computing through industry collaboration

      Part of our mission to bring Confidential Computing technology to more cloud workloads and services is to make sure that the hardware and software used to build these technologies is continuously reviewed and tested. We evaluate different attack vectors to help ensure Google Cloud Confidential Computing environments are protected against a broad range of attacks. As part of this evaluation, we recognize that the secure use of our services and the Internet ecosystem as a whole depends on interactions with applications, hardware, software, and services that Google doesn't own or operate. 

      The Google Cloud Security team, Google Project Zero, and the AMD firmware and product security teams collaborated for several months to conduct a detailed review of the technology and firmware that powers AMD Confidential Computing technology. This review covered both Secure Encrypted Virtualization (SEV) capable CPUs, and the next generation of Secure Nested Paging (SEV-SNP) capable CPUs which protect confidential VMs against the hypervisor itself. The goal of this review was to work together and analyze the firmware and technologies AMD uses to help build Google Cloud’s Confidential Computing services to further build trust in these technologies.

      This in-depth review focused on the implementation of the AMD secure processor in the third generation AMD EPYC processor family delivering SEV-SNP. SNP further improves the posture of confidential computing using technology that removes the hypervisor from the trust boundary of the guest, allowing customers to treat the Cloud Service Provider as another untrusted party. The review covered several AMD secure processor components and evaluated multiple different attack vectors. The collective group reviewed the design and source code implementation of SEV, wrote custom test code, and ran hardware security tests, attempting to identify any potential vulnerabilities that could affect this environment.

      PCIe hardware pentesting using an IO screamer

      Working on this review, the security teams identified and confirmed potential issues of varying severity. AMD was diligent in fixing all applicable issues and now offers updated firmware through its OEM channels. Google Cloud’s AMD-based Confidential Computing solutions now include all the mitigations implemented during the security review.

      “At Google, we believe that investing in security research outside of our own platforms is a critical step in keeping organizations across the broader ecosystem safe,” said Royal Hansen, vice president of Security Engineering at Google. “At the end of the day, we all benefit from a secure ecosystem that organizations rely on for their technology needs and that is why we’re incredibly appreciative of our strong collaboration with AMD on these efforts.” 

      “Together, AMD and Google Cloud are continuing to advance Confidential Computing, helping enterprises to move sensitive workloads to the cloud with high levels of privacy and security, without compromising performance,” said Mark Papermaster, AMD’s executive vice president and chief technology officer. ”Continuously investing in the security of these technologies through collaboration with the industry is critical to providing customer transformation through Confidential Computing. We’re thankful to have partnered with Google Cloud and the Google Security teams to advance our security technology and help shape future Confidential Computing innovations to come.”  

      Reviewing trusted execution environments for security is difficult given the closed-source firmware and proprietary hardware components. This is why research and collaborations such as this are critical to improve the security of foundational components that support the broader Internet ecosystem. AMD and Google believe that transparency helps provide further assurance to customers adopting Confidential Computing, and to that end AMD is working toward a model of open source security firmware.

      With the analysis now complete and the vulnerabilities addressed, the AMD and Google security teams agree that the AMD firmware which enables Confidential Computing solutions meets an elevated security bar for customers, as the firmware design updates mitigate several bug classes and offer a way to recover from vulnerabilities. More importantly, the review also found that Confidential VMs are protected against a broad range of attacks described in the review.

      Google Cloud’s Confidential Computing portfolio 

      Google Cloud Confidential VMs, Dataproc Confidential Compute, and Confidential GKE Nodes have enabled high levels of security and privacy to address our customers’ data protection needs without compromising usability, performance, and scale. Our mission is to make this technology ubiquitous across the cloud. Confidential VMs run on hosts with AMD EPYC processors, which feature AMD Secure Encrypted Virtualization (SEV). Incorporating SEV into Confidential VMs provides benefits and features including: 

      • Isolation: Memory encryption keys are generated by the AMD Secure Processor during VM creation and reside solely within the AMD Secure Processor. Other VM encryption keys, such as those for disk encryption, can be generated and managed by an external key manager or in Google Cloud HSM. Neither set of keys is accessible to Google Cloud, offering strong isolation. 

      • Attestation: Confidential VMs use Virtual Trusted Platform Module (vTPM) attestation. Every time a Confidential VM boots, a launch attestation report event is generated and posted to customer cloud logging, which gives administrators the opportunity to act as necessary.

      • Performance: Confidential Computing offers high performance for demanding computational tasks. Enabling Confidential VM has little or no impact on most workloads. 

      The future of Confidential Computing and secure platforms

      While there are no absolutes in computer security, collaborative research efforts help uncover security vulnerabilities that can emerge in complex environments and help protect Confidential Computing solutions from threats today and into the future. Ultimately, this helps us increase levels of trust for customers. 

      We believe Confidential Computing is an industry-wide effort that is critical for securing sensitive workloads in the cloud and are grateful to AMD for their continued collaboration on this journey. 

      To read the full security review, visit this page.


      Acknowledgments 

      We thank the many Google security team members who contributed to this ongoing security collaboration and review, including James Forshaw, Jann Horn and Mark Brand.

      We are grateful for the open collaboration with AMD engineers, and wish to thank David Kaplan, Richard Relph and Nathan Nadarajah for their commitment to product security. We would also like to thank AMD leadership: Ab Nacef, Prabhu Jayanna, Hugo Romero, Andrej Zdravkovic and Mark Papermaster for their support of this joint effort.

      Related Article

      Expanding Google Cloud’s Confidential Computing portfolio

      Google Cloud Confidential Computing is now GA and includes Confidential GKE Nodes.

    • 3co reinvents the digital shopping experience with augmented reality on Google Cloud Tue, 10 May 2022 16:00:00 -0000

      Giving people as close to a “try-before-you-buy” experience as possible is essential for retailers. With the move to online shopping further accelerated by the COVID-19 pandemic, many people are now comfortable shopping online for items they previously only considered buying in stores. The problem is that it can still be difficult for shoppers to get a hands-on feel for items, given the limitations of even some of today’s most advanced augmented reality (AR) technologies. And while retailers continue to invest heavily in creating the most life-like digital experiences possible, the results often come up short for shoppers who have more digital buying options than ever. 

      To make AR experiences more convincing for shoppers—and for anyone wanting richer, more immersive experiences in entertainment and other industries—the depiction of real-world physical objects in digital spaces needs to continue to improve and evolve. As avid plant lovers, we knew the experience of viewing and buying plants online was severely lacking. That prompted our initial exploration into rethinking what’s possible with AR: we built a direct-to-consumer app for buying plants in AR. However, during our time in the Techstars program, we quickly realized that improving how people see and experience plants online was just a fraction of a much bigger, multi-billion-dollar opportunity for us. Since 2018, 3co has been laser-focused (quite literally) on scaling 3D tech for all of e-commerce.

      An automated 3D scanning system for photorealistic 3D modeling of retail products, designed by 3co and powered by Google Cloud.

      Closing the gap between imagination and reality with Google Cloud

      With that in mind, 3co began developing breakthroughs needed in 3D computer vision. Our advanced artificial intelligence (AI) stack is designed to give companies an all-in-one 3D commerce platform to easily and cost-effectively create realistic 3D models of physical objects and stage them in virtual showrooms.

      When building our AR platform, we quickly understood that engineering 3D simulations with sub-perceptual precision requires an enormous amount of compute power. Fortunately the problems are parallelizable. But it simply isn’t possible to 3D model the complex real world with superhuman precision on conventional laptops or desktops.

      As a part of the Google for Startups Cloud Program, Startup Success Managers helped 3co plug into the full power of Google’s industry-leading compute capabilities. For several projects, we selected a scalable Compute Engine configuration powerful enough to solve even the most complex 3D graphics optimizations at scale. Today, with the A2 virtual machine, 3co leverages NVIDIA Ampere A100 Tensor Core GPUs to create more life-like 3D renderings over ten times faster. And this is just the beginning.

      We’re also proud to have deployed a customized streaming GUI on top of Google’s monstrous machines, which allowed our colleagues across the world (including in Amsterdam and Miami) to plug-and-play with the latest 3D models on a world-class industrial GPU. I would highly recommend that companies solving super-hard AI and/or 3D challenges with a distributed team consider adopting cloud resources in the same way. It was a delight to see Blender render gigabyte 3D models faster than ever before in my life.

      GUI for 3D modeling, streamed from Google Cloud computers by 3co, which unlocked previously impossible collaborative workflows on gigabyte-sized 3D models.

      Equally critical, with our technology, 3D artists in retail, media and entertainment, and other industries pressured to deliver more (and more immersive) AR experiences can cut the cost and time of generating photorealistic 3D models by as much as tenfold. We know this from our own work: we’ve seen computing costs to generate the highest-quality 3D experiences drop significantly, even though we run an advanced Compute Engine configuration loaded with powerful GPUs, high-end CPUs, and massive amounts of RAM. If the goal is to scale industry-leading compute power quickly for a global customer base, Google Cloud is the proper solution. 

      Cloud Storage is another key but often overlooked component of the Google Cloud ecosystem, and it is critical for 3co. We need the high throughput, low latency, and instant scalability delivered by local cloud SSDs to support the massive amounts of data we generate, store, and stream. The local SSDs complement our A2 Compute Engine instances and are physically attached to the servers hosting the virtual machine instances. This local configuration supports extremely high input/output operations per second (IOPS) with very low latency compared to persistent disks.

      To top it off, Cloud Logging delivers real-time log management at exabyte scale, ingesting analytic events that are streamed to data lakes with Pub/Sub, so we can know, while enjoying the beach here in Miami, Florida, that everything is going smoothly in the cloud.

      Building the 3co AI stack with TensorFlow

      Building one of the world's most advanced 3D computer vision solutions would not have been possible without TensorFlow and its comprehensive ecosystem of tools, libraries, and community resources. Since the launch of TensorFlow in 2015, I’ve personally built dozens of deep learning systems using this battle-hardened, open source Google framework for AI. Through TensorFlow on Google Cloud, 3co is able to scale its compute power for the creation of truly photorealistic digital models of physical objects, down to microscopic computation of material textures and deep representations of surface light transport from all angles.

      Most recently, 3co has been making massive progress on top of the TensorFlow implementation of Neural Radiance Fields (“NeRF”, Mildenhall et al. 2020). We are humbled to note that this breakthrough AI in TensorFlow truly is disruptive for the 3D modeling industry: we anticipate the next decade in 3D modeling will be increasingly shaped and colored by similar neural networks (I believe the key insight of the original authors of NeRF is to force a neural network to learn a physics-based model of light transport). For our contribution, 3co is now (1) adapting NeRF-like neural networks to optimally leverage sensor data from various leading devices for 3D computer vision, and (2) forcing these neural networks to learn industry-standard 3D modeling data structures, which can instantly plug-and-play on the leading 3D platforms. As Isaac Newton said, “If I have seen further, it is by standing on the shoulders of giants.” That is, tech giants.

      In several ways, TensorFlow is the go-to solution both for prototyping and for large-scale deployment of AI in general. Under the hood, TensorFlow uses a sophisticated compiler (XLA) to optimize how computations are allocated on the underlying hardware.
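
      As a toy illustration (this is not 3co’s code), opting a TensorFlow function into XLA compilation is a one-flag change:

      import tensorflow as tf

      # Hypothetical toy function: jit_compile=True asks TensorFlow to
      # compile the traced computation with XLA.
      @tf.function(jit_compile=True)
      def dense_step(x, w):
          return tf.nn.relu(tf.matmul(x, w))

      x = tf.random.normal([64, 256])
      w = tf.random.normal([256, 128])
      y = dense_step(x, w)  # first call triggers XLA compilation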

      3co achieved a 10x speed-up in neural network training time (for inverse rendering optimization), by compiling its computations with TensorFlow XLA.

      Unlike its competitors (e.g. PyTorch, JAX), TensorFlow can also compile binaries to run on TPUs (i.e. TFLite) and across device architectures (e.g. iOS, Android, JavaScript). This ability is important because 3co is committed to delivering 3D computer vision wherever it is needed, with maximum speed and accuracy. Through TensorFlow on Google Cloud, 3co has been able to speed up experimental validation of patent-pending 3D computer vision systems that can run the same TensorFlow code across smartphones, LIDAR scanners, AR glasses, and so much more.

      3co is developing an operating system for 3D computer vision powered by TensorFlow, in order to unify development of a single codebase for AI, across the most common sensors & processors.

      TensorFlow also enables 3co’s neural networks to train faster, through an easy API for distributed training across many computers. Distributed deep learning was the focus of my master’s thesis in 2013 (inspired by work from Jeff Dean, Andrew Ng, and Google Brain), so you can imagine how excited I was to see Google optimize these industry-leading capabilities for the open source community over the following years. Parallelization of deep learning has consistently proven essential for creating this advanced AI, and 3co is no exception to the rule. Faster AI training also means faster conclusion of R&D experiments. As Sam Altman says, “The number one predictor of success for a very young startup: rate of iteration.” From day one, TensorFlow was built to speed up Google’s AI computing challenges at the biggest scale, but it also “just works” at the earliest stages of exploration. Through TensorFlow on Google Cloud, 3co is steadily improving its capabilities for autonomous photorealistic 3D modeling. Simple and flexible architectures for fast experimentation enable us to move quickly from concept to code, and from code to state-of-the-art deployed ML models. Through TensorFlow, Google has given 3co a powerful tool to better serve customers with modern AI and computer vision. 
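
      For reference, here is a generic sketch (not 3co’s training code) of that distribution API: the same Keras model code runs across the accelerators found by the cluster resolver when it is built under a strategy scope.

      import tensorflow as tf

      # Generic sketch: resolver arguments are deployment-specific
      # ("local" is the value used on a TPU VM).
      resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="local")
      tf.config.experimental_connect_to_cluster(resolver)
      tf.tpu.experimental.initialize_tpu_system(resolver)
      strategy = tf.distribute.TPUStrategy(resolver)

      with strategy.scope():
          # Variables created here are automatically replicated/sharded.
          model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
          model.compile(optimizer="adam", loss="mse")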

      In the future, 3co has big plans involving supercomputers of Google Cloud Tensor Processing Units (TPUs), so we plan to achieve even greater speed and cost optimization. Running TensorFlow on Cloud TPUs requires just a little bit of extra work by the AI developer, but Google is increasingly making it easier to plug-and-play on these gargantuan computing architectures. They truly are world class servers for AI. I remember being as excited as a little boy in a candy store, reading research back in 2017 on Google’s TPUs, which was the climax of R&D for literally dozens of super smart computer engineers. Since then, several versions of TPUs have been deployed internally at Google for many kinds of applications (e.g. Google Translate), and increasingly have been made more useful and accessible. Startups like 3co – and our customers – can benefit so much here. Through the use of advanced computer processors like TPUs, 3co expects to parallelize its AI to perform photorealistic 3D modeling of real scenes in real-time. Imagine the possibilities for commerce, gaming, entertainment, design, and architecture that this ability could unlock. 

      Scaling 3D commerce with Google Cloud and credits

      3co’s participation in the Google for Startups Cloud Program (facilitated via Techstars, we also can’t thank them enough) has been instrumental to our success in closing the gap between imagination and reality. It’s a mission we’ve been working on for years – and will continue to hone for many years to come. And this success is thanks to the Google for Startups Success team: they are truly amazing. They just care about you. If you’re a startup founder, just reach out to them: they really do wonders. We especially want to highlight the Google Cloud research credits which provided 3co access to vastly greater amounts of compute power. We are so grateful to Google Cloud for enabling 3co to scale its 3D computer vision services to customers worldwide. I love that 3co is empowered by Google to help many people see the world in a new light.  


      If you want to learn more about how Google Cloud can help your startup, visit our page here to get more information about our program, and sign up for our communications to get a look at our community activities, digital events, special offers, and more.

      Related Article

      The Future of Data: Unified, flexible, and accessible

      Google Cloud’s whitepaper explores why the future of data will involve three key themes: unified, flexible, and accessible.

    • How Google Cloud and SAP solve big problems for big companies Tue, 10 May 2022 13:00:00 -0000

      With SAP Sapphire kicking off today in Orlando, we’re looking forward to seeing our customers and discussing how they can make core processes more efficient and improve how they serve their customers.

      One thing is certain to be top of mind: the global supply chain challenges facing the world today. These challenges affect every business across every industry, from common household items that once filled store shelves and are now on backorder, to essential goods and services like food and medical treatments, which are at risk. Even cloud-native companies are making changes to ensure they have the insights, equipment, and other assets they need to continue serving customers. 

      We are proud to work with SAP on many initiatives that are driving results for our customers and helping them run more intelligent and sustainable companies. I’d like to highlight three of these important initiatives and how they are helping address global supply chain challenges. 

      Enabling more efficient migrations of critical workloads 

      We know a key barrier to entry in the cloud is the ability to easily migrate from on-premises environments. Our cloud provides a safe path to help companies including Johnson Controls, PayPal, and Kaeser Compressor digitize and solve large, complex business problems, reduce costs, scale without cycles of investment, and gain access to key services and capabilities that can unlock value and enable growth. 

      Singapore-based shipping company Ocean Network Express (ONE) has become more agile by running its mission-critical SAP workloads on Google Cloud and using our data analytics to improve operational efficiency and make faster decisions. It has gone from an on-premises data warehouse solution that would take a full day to load data from SAP S/4HANA, to our BigQuery solution, which delivers business insights in minutes.

      Since The Home Depot moved its critical SAP workloads to Google Cloud, the company has been able to shorten the time it takes to prepare a supply chain use case from 8 hours to 5 minutes by using BigQuery to analyze large volumes of internal and external data. This helps improve forecast accuracy and more effectively replenish inventory by being able to create a new plan when circumstances change unexpectedly with demand or a supplier.

      Accelerating cloud benefits through RISE and LiveMigration 

      At Google Cloud, we have dedicated programs to help migrate SAP and other mission-critical workloads to our cloud with our Cloud Acceleration Program for SAP.

      For SAP customers moving to Google Cloud, we provide LiveMigration for superior uptime and business continuity. LiveMigration eliminates the downtime required for planned infrastructure maintenance: your SAP system continues running even while Google Cloud performs planned infrastructure maintenance upgrades, ensuring superior business continuity for your mission-critical workloads. 

      We are also proud to be a strategic partner in the RISE with SAP program, which helps accelerate cloud migration for SAP’s global customer base while minimizing risks along the migration journey. This program provides solutions and expertise from SAP and technology ecosystem partners to help companies transform through process consulting, workload migration services, cloud infrastructure, and ongoing training and support. To secure your mission-critical workloads, SAP and Google Cloud can provide a 99.9% uptime SLA as part of the RISE with SAP program.

      Many large manufacturers have taken advantage of RISE with SAP to forge a secure, proven path to our cloud, including Energizer Holdings Inc., a leading manufacturer and distributor of primary batteries, portable lights, and auto care products. Energizer has turned to RISE with SAP on Google Cloud to power its move to SAP S/4HANA. The company wants to automate essential business processes, improve customer service, and boost innovation. It had been using a private cloud solution but needed to gain flexibility while better containing costs.

      “SAP S/4HANA for central finance will help us automate essential business processes, improve customer service, and fuel innovation that grows our company’s leadership position globally. We selected RISE with SAP to begin our journey to SAP S/4HANA and maintain the freedom and flexibility to move at our own pace,” said Energizer Chief Information Officer Dan McCarthy.

      Another example is global automotive distributor Inchcape, which moved its mission-critical sales, marketing, finance, and operations systems and data to Google Cloud. With its diverse data sets now in a single, secure cloud platform, Inchcape is applying Google Cloud AI and ML capabilities to manage and analyze its data, automate operations, and ultimately transform the car ownership experience for millions. 

      "Google Cloud's close relationship with SAP and its strong technical expertise in this space were a big pull for us,” said Mark Dearnley, Chief Digital Officer at Inchcape. “Ultimately, we wanted a headache-free RISE with SAP implementation and to unlock value for auto makers and consumers in all our regions, while continuing to have the choice and flexibility to modernize our 150-year old business in a way that works for us." 

      A new intelligence layer for all SAP Google Cloud customers

      When moving mission-critical workloads to the cloud, companies not only need to migrate safely, they also need to quickly realize value, which we enable with Google Cloud Cortex Framework — a layer of intelligence that integrates with SAP Business Technology Platform (SAP BTP). Google Cloud Cortex Framework provides reference architectures, deployment accelerators, and integration services for analytics scenarios. 

      Like many large e-commerce companies, Mercado Libre experienced skyrocketing transactions that more than doubled in 2020 as people sheltered at home during the pandemic, and they are anticipating more growth. The Google Cloud Cortex Framework is enabling Mercado Libre to respond, run more efficiently, and make faster, data-driven decisions.

      Continued partnership to support organizations around the world 

      Our longstanding partnership with SAP continues to yield exciting innovations for our customers, and we’re honored to work with them to help customers address the ongoing impact of global supply chain challenges. We’re looking forward to sharing new insights and innovations at SAP Sapphire this week, and to listening and learning from you about your plans and challenges, and how we can best support your transformation to the cloud.

      Related Article

      6 SAP companies driving business results with BigQuery

      SAP systems generate large amounts of key operational data. Learn how six Google Cloud customers are leveraging BigQuery to drive value f...

    • Cloud TPU VMs are generally available Mon, 09 May 2022 23:00:00 -0000

      Last year, Cloud TPU VMs were introduced on Google Cloud to make the TPU hardware easier to use by providing direct access to TPU host machines. Today, we are excited to announce the general availability (GA) of TPU VMs.

      With Cloud TPU VMs you can work interactively on the same hosts where the physical TPU hardware is attached. Our rapidly growing TPU user community has enthusiastically adopted this access mechanism: it not only makes for a better debugging experience, but also enables certain training setups, such as Distributed Reinforcement Learning, that were not feasible with the TPU Node (network-attached) architecture.

      What’s new for the GA release?

      Cloud TPUs are now optimized for large-scale ranking and recommendation workloads. We are also thrilled to share that Snap, an early adopter of this new capability, achieved a ~4.65x perf/TCO improvement on their business-critical ad ranking workload. Here are a few highlights from Snap’s blog post on Training Large Scale Recommendation Models:

      > TPUs can offer much faster training speed and significantly lower training costs for recommendation system models than CPUs;
      > TensorFlow for Cloud TPU provides a powerful API to handle large embedding tables and fast lookups;
      > On a TPU v3-32 slice, Snap was able to get ~3x better throughput (-67.3% throughput on A100) at 52.1% lower cost compared to an equivalent A100 configuration (~4.65x perf/TCO)

      Ranking and recommendation

      With the TPU VMs GA release, we are introducing the new TPU Embedding API, which can accelerate ML-based ranking and recommendation workloads.

      Many businesses today are built around ranking and recommendation use cases, such as audio/video recommendations, product recommendations (apps, e-commerce), and ad ranking. These businesses rely on ranking and recommendation algorithms to serve their users and drive their business goals. In the last few years, the approaches to these algorithms have evolved from being purely statistical to deep neural network-based. These modern DNN-based algorithms offer greater scalability and accuracy, but they can come at a cost. They tend to use large amounts of data and can be difficult and expensive to train and deploy with traditional ML infrastructure.

      Embedding acceleration with Cloud TPU can solve this problem at a lower cost. Embedding APIs can efficiently handle large amounts of data, such as embedding tables, by automatically sharding across hundreds of Cloud TPU chips in a pod, all connected to one another via the custom-built interconnect.

      To help you get started, we are releasing the TF2 ranking and recommendation APIs as part of the TensorFlow Recommenders library. We have also open sourced the DLRM and DCN v2 ranking models in the TF2 Model Garden, and detailed tutorials are available here. A rough sketch of the embedding configuration follows.
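
      As a rough sketch of the new API (the table sizes here are hypothetical; see the tutorials for complete models), an embedding table can be configured for automatic sharding across the TPU chips in a slice like this:

      import tensorflow as tf
      import tensorflow_recommenders as tfrs

      # Hypothetical sizes: a 100M-row user table, sharded automatically
      # across the TPU chips available to the training job.
      table = tf.tpu.experimental.embedding.TableConfig(
          vocabulary_size=100_000_000, dim=128, name="user_table")
      feature = tf.tpu.experimental.embedding.FeatureConfig(
          table=table, name="user_id")

      embedding_layer = tfrs.layers.embedding.TPUEmbedding(
          feature_config={"user_id": feature},
          optimizer=tf.tpu.experimental.embedding.Adagrad(learning_rate=0.05))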

      Framework support

      The TPU VM GA release supports the three major frameworks (TensorFlow, PyTorch, and JAX), now offered through three optimized environments for ease of setup with the respective framework. The GA release has been validated with TensorFlow v2-tf-stable, PyTorch/XLA v1.11, and JAX 0.3.6.

      TPU VM-specific features

      TPU VMs offer several additional capabilities over the TPU Node architecture thanks to the local execution setup, i.e., TPU hardware attached to the same host where users execute their training workloads.

      Local execution of input pipeline 

      The input data pipeline executes directly on the TPU hosts. This saves precious computing resources that were previously consumed by instance groups for PyTorch/JAX distributed training. In the case of TensorFlow, the distributed training setup requires only one user VM, with the data pipeline executing directly on the TPU hosts.

      The following study summarizes the cost comparison for Transformer (FairSeq; PyTorch/XLA) training executed for 10 epochs on the TPU VM vs. the TPU Node architecture (network-attached Cloud TPUs):

      Google Internal data (published benchmark conducted on Cloud TPU by Google).

      Distributed Reinforcement Learning with TPU VMs

      Local execution on the host with the accelerator also enables use cases such as Distributed Reinforcement Learning. Canonical works in this domain, such as SEED RL, IMPALA, and Podracer, have been developed using Cloud TPUs.


      “…we argue that the compute requirements of large scale reinforcement learning systems are particularly well suited for making use of Cloud TPUs, and specifically TPU Pods: special configurations in a Google data center that feature multiple TPU devices interconnected by low latency communication channels.” —Podracer, DeepMind


      Custom Ops Support for TensorFlow

      With direct execution on the TPU VM, users can now build their own custom ops, such as TensorFlow Text. With this feature, users are no longer bound to TensorFlow runtime release versions.

      What are our customers saying?

      “Over the last couple of years, Kakao Brain has developed numerous groundbreaking AI services and models, including minDALL-E, KoGPT and, most recently, RQ-Transformer. We’ve been using TPU VM architecture since its early launch on Google Cloud, and have experienced significant performance improvements compared to the original TPU node set up. We are very excited about the new features added in the Generally Available version of TPU VM, such as Embeddings API, and plan to continue using TPUs to solve some of the globe's biggest 'unthinkable questions' with solutions enabled by its lifestyle-transforming AI technologies”—Kim Il-doo, CEO of Kakao Brain

      Additional customer testimonials are available here.

      How to get started?

      To start using TPU VMs, you can follow one of our quickstarts or tutorials, for example starting from the commands below. If you are new to TPUs, you can explore our concept deep dives and system architecture. We strive to make Cloud TPUs, Google's advanced AI infrastructure, universally useful and accessible.
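
      For example (the zone, accelerator type, and runtime version below are placeholders; choose ones available to your project), creating and connecting to a TPU VM looks like this:

      gcloud compute tpus tpu-vm create my-tpu \
          --zone=us-central1-b --accelerator-type=v2-8 \
          --version=tpu-vm-tf-2.8.0

      gcloud compute tpus tpu-vm ssh my-tpu --zone=us-central1-b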

      Related Article

      Google showcases Cloud TPU v4 Pods for large model training

      Google’s MLPerf v1.1 Training submission showcased two large (480B & 200B parameter) language models using publicly available Cloud TPU v...




    Google has many products and the following is a list of its products: Android Auto, Android OS, Android TV, Calendar, Cardboard, Chrome, Chrome Enterprise, Chromebook, Chromecast, Connected Home, Contacts, Digital Wellbeing, Docs, Drive, Earth, Finance, Forms, Gboard, Gmail, Google Alerts, Google Analytics, Google Arts & Culture, Google Assistant, Google Authenticator, Google Chat, Google Classroom, Google Duo, Google Expeditions, Google Family Link, Google Fi, Google Files, Google Find My Device, Google Fit, Google Flights, Google Fonts, Google Groups, Google Home App, Google Input Tools, Google Lens, Google Meet, Google One, Google Pay, Google Photos, Google Play, Google Play Books, Google Play Games, Google Play Pass, Google Play Protect, Google Podcasts, Google Shopping, Google Street View, Google TV, Google Tasks, Hangouts, Keep, Maps, Measure, Messages, News, PhotoScan, Pixel, Pixel Buds, Pixelbook, Scholar, Search, Sheets, Sites, Slides, Snapseed, Stadia, Tilt Brush, Translate, Travel, Trusted Contacts, Voice, Waze, Wear OS by Google, YouTube, YouTube Kids, YouTube Music, YouTube TV, YouTube VR.



















    Computerworld » Google

    • The missing piece in Google's Pixel puzzle Fri, 13 May 2022 02:45:00 -0700

      All right, stop me if you've heard this before: Google's about to get serious about hardware.

      Yeah, yeah — I know. I'll pause for a second while you regain your composure.

      Look, I'm a huge fan of what Google's trying to do with its Pixel products. If you've read my ramblings for long (or seen the NSFW multicolored "P"-logo tattoos on various parts of my person), you know how I feel about the Pixel's place in the Android ecosystem and the critical role it plays. (Just kidding about the tattoos, by the way.) (For now.)

      But the truth is that we've been hearing the "Google's about to get serious about hardware" line for a long time now — over and over and over again. At a certain point, you've gotta ask: "Uh, gang? When is this actually starting?!"


    • Why Apple needs to evict old and unsupported App Store apps Tue, 10 May 2022 09:08:00 -0700

      Apple’s recently announced plan to get rid of unloved older apps from the App Store may have annoyed some developers, but with more than 1 million abandoned apps littered across Google's and Apple’s App Stores, the evidence supports the decision.

      What Apple said about its plans

      In an April note to developers, Apple warned that it intends to begin removing old apps that have not been updated for three or more years and have seen few downloads in the preceding 12 months.

      “We are implementing an ongoing process of evaluating apps, removing apps that no longer function as intended, don’t follow current review guidelines, or are outdated,” the company said.


    • 3 clever new tricks to turn Google Docs into a collaboration superhub Tue, 10 May 2022 03:00:00 -0700

      Google's annual I/O developers' conference kicks off on Wednesday, and we're sure to see all sorts of intriguing new stuff across the entire suite of Google services.

      Here's a little secret, though: You don't have to wait 'til then to find something new and useful. Google rolls out game-changing additions to its apps and products almost constantly, all year long. Most of the goodies just show up with surprisingly little fanfare and end up getting lost in the shuffle.

      That's why today, as we sit patiently and twiddle our collective thumbs ahead of Google's big ol' honkin' announcement extravaganza, I want to draw your attention to a series of spectacular additions in the oft-dusty Google Docs domain. These new features quietly crept into the software over the past several weeks, but most mere mortals would have no way of even knowing.


    • Google, others adding office space in anticipation of the great return Mon, 09 May 2022 03:00:00 -0700

      Since January 2020, Google’s parent company Alphabet has spent nearly $100 million on expanding its U.S. commercial real estate portfolio, including a $28.5 million office it bought in Sunnyvale, CA, at the height of the pandemic.

      More recently, Alphabet announced in January it would spend $1 billion for a campus-like office setting in London.

      “We'll be introducing new types of collaboration spaces for in-person teamwork, as well as creating more overall space to improve wellbeing,” Ronan Harris, managing director of Google UK wrote in a blog post. “We’ll introduce team pods, which are flexible new space types that can be reconfigured in multiple ways, supporting focused work, collaboration or both, based on team needs. The new refurbishment will also feature outdoor covered working spaces to enable work in the fresh air.”


    • Apple employees revolt against mandatory back-to-work policy Fri, 06 May 2022 03:00:00 -0700

      A group of Apple employees is pushing back against a mandate by the company requiring them to return to the office three days a week. The group, which calls itself “Apple Together,” published an open letter to executives criticizing the company’s Hybrid Work Pilot program, characterizing it as inflexible.

      Among other grievances, the anonymous letter described the company’s requirement that employees spend three days a week in the office as showing “almost no flexibility at all.”

      "Office-bound work is a technology from the last century, from the era before ubiquitous video-call-capable internet and everyone being on the same internal chat application," the letter says. "But the future is about connecting when it makes sense, with people who have relevant input, no matter where they are based.


    • 7 hidden Pixel shortcuts you probably aren't using Fri, 06 May 2022 02:45:00 -0700

      We're likely just days away from the launch of Google's latest Pixel phone — the potentially pivotal Pixel 6a midranger. So it seems safe to say the subject of Googley phones is gonna be comin' up a bunch in the weeks ahead, with snazzy new hardware being the main theme of the moment.

      The nice thing about Pixel phones, though, is that you don't have to have the latest and greatest model in order to find some fantastically useful new tricks. Google's constantly updating its Pixels with features both big and small, and it's all too easy for some of the more subtle touches to get lost in the shuffle.


    • Google acquires Raxium in augmented reality push Thu, 05 May 2022 03:58:00 -0700

      Google has acquired Raxium, a five-year-old Bay Area startup working on microLED display technologies for wearables and augmented and virtual reality (AR and VR) headsets.

      “Raxium’s technical expertise in this area will play a key role as we continue to invest in our hardware efforts,” Rick Osterloh, senior vice president of devices and services at Google, wrote in a blog post. The Raxium team will immediately join Google’s devices and services team.

      The financial terms for the deal were undisclosed, but could be as much as $1 billion according to earlier reports by The Information.


    • Microsoft Edge has edged out Apple's Safari in browser popularity Thu, 05 May 2022 03:33:00 -0700

      Microsoft Edge has passed Apple's Safari to become the world's second most popular desktop browser based on data provided by web analytics service StatCounter.

      In February, Microsoft Edge was on the cusp of catching Safari with less than a half percentage point (9.54% to 9.84%) between the two browsers in terms of popularity among desktop users. StatCounter's latest figures show Edge is now used on 10.07% of desktop computers worldwide, 0.46% ahead of Safari; the latter dropped to 9.61%.

      Google Chrome still holds the top spot by a long shot, at 66.58% of all desktop users. And Mozilla's Firefox had just 7.87% of the share, a significant drop from the 9.18% share it held in February. The new data was first reported by MacRumors.


    • Google offers US businesses $100,000 worth of digital skills training Tue, 03 May 2022 03:55:00 -0700

      Google has announced plans to provide $100,000 worth of Google Career Certificates to US-based businesses that want to train their employees in data analytics, digital marketing, IT support, project management, or user experience (UX) design.

      Eligible organizations can apply for up to 500 scholarships each in a variety of digital skills. Google says that no previous experience is required and credentials can be earned over a period of either three or six months of part-time study.


    • Download: UEM vendor comparison chart 2022 Tue, 03 May 2022 03:00:00 -0700

      Unified endpoint management (UEM) is a strategic IT approach that consolidates how enterprises secure and manage an array of deployed devices including phones, tablets, PCs, and even IoT devices.


    • 3 invisible reasons to get excited about Android 13 Wed, 27 Apr 2022 03:00:00 -0700

      Brace yourself, my fellow Android-adoring animal: You're about to experience a whirlwind of conflicting emotions. Ready?

      First things first: Google just launched the first official beta version of this fall's Android 13 update! If you've got a current Pixel phone, that means you can download it onto your device this very second and see all the latest and greatest stuff Google's got cookin' up for our future. (Yay!)

      Now, for the twist: Unlike most Android betas, this inaugural Android 13 beta release still doesn't have most of the software's key features. It's focused primarily on foundational elements and under-the-hood improvements, and outwardly, it's pretty darn similar to the previews that preceded it. Honestly, it's almost more like another developer preview than a beta — at least, in typical Android terms. (Aww...)


    • How the Pixel 6a could completely reshape Android Wed, 20 Apr 2022 03:00:00 -0700

      Talk to most Android enthusiasts about their current causes of excitement, and odds are, the Pixel 6a won't be high on the list.

      It's no wonder, really: The Pixel 6a — Google's upcoming midrange phone model, which signs increasingly suggest should be landing around the time of the company's I/O developers' conference in May — will almost certainly just be a lower-end equivalent of the more premium Pixel 6 flagship that came out last fall.

      And hey, the higher-end flagship phones are where all the truly exciting stuff happens — right? Especially at a time when pretty much every device-maker seems to be working on whiz-bang gizmos that bend, fold, and occasionally perform jaunty little jigs for your amusement, a decidedly mundane midrange model is hardly cause for exhilaration. RIGHT?!


    • 4 buried treasures that'll transform your Chrome OS experience Fri, 08 Apr 2022 02:55:00 -0700

      All right, my fellow Chrome OS adventurer — see if this progression feels familiar:

      • First: "Ooh, look! My Chromebook's getting a huge update this week! Cool new features! SO EXCITING!!"
      • Then: "Oh. The update's here, but everything looks the same. Where's all the new stuff? So disappointing."

      It's an all-too-common pairing here in the land o' Googley matters, and that exact progression is in the midst of playing out for many an eager Chromebook owner this very moment.

      Have you felt it yet? This latest up-and-down got going when Google made a splashy announcement about its 100th Chrome OS release the other day — including, most notably, the long-in-progress launch of a snazzy new revamped Chromebook launcher (ooh, ahh, etc.).


    • Android 12 Upgrade Report Card: What a weird year Tue, 05 Apr 2022 03:00:00 -0700

      In the world of software, six months is an eternity.

      Heck, look at how much has happened over the past six months since Android 12 came into the universe. Google started and then finished a hefty 0.1-style update that lays the groundwork for significant large-screen improvements to the Android experience. And it's now well into the public development phase of its next big Android version, Android 13 — which is the rapidly forming release on most folks' minds at this point.


    • Android tablets? Chromebook tablets? How Google thinks both can thrive Wed, 30 Mar 2022 03:00:00 -0700

      Lemme let you in on a little secret: Google isn't your average software company.

      Understatement of the century, I know, right? But it's tough to talk about the ever-shifting intersection of Android and Chrome OS without first putting that out there.

      I mean, think about it: For just over a decade now, Google's been simultaneously developing and promoting two totally separate but increasingly overlapping paths for experiencing the best that its apps and services have to offer.

      You know the deal: On the one side, you've got Android — the go-to platform for touch-centric mobile products. And on the other, there's Chrome OS — the once-barebones computer framework that's grown into a powerful and platform-defying "Everything"-level operating system.


    • 8 hidden Pixel features for smarter calling Fri, 25 Mar 2022 03:00:00 -0700

      Pixel phones are filled with efficiency-enhancing Google intelligence, and one area that's all too easy to overlook is the way the devices can improve the act of actually talking on your cellular telephone.

      Talking on your phone, you say? What is this, 1987?! Believe me, I get it, Vanilli. We're all perpetually busy creatures these days, and the timeless art of speaking to another human on your mobile device can seem both archaic and annoying.

      But hear me out: Here in the real world, placing or accepting (or maybe even just avoiding) a good old-fashioned phone call is occasionally inescapable. That's especially true in the world of business, but it's also apparent in other areas of life — from dialing up a restaurant to confirm your carnitas are ready to dodging your Uncle Ned's quarterly regards-sending check-ins. (No offense, Ned. I never dodge your calls. Really. Send my regards to Aunt Agnes.)


    • A fascinating twist at the intersection of Android and Chrome OS Wed, 23 Mar 2022 03:00:00 -0700

      My, oh my, we sure have come a long way in these Google-scented quarters of ours.

      'Twas not so long ago, after all, when everyone and their grandpappy was completely convinced that Google had to be working on combining its two primary platforms — y'know, those silly li'l things we like to call Chrome OS and Android.

      For years, we heard endless declarations about how Chrome OS was a dead man walking and how Android would somehow swallow it up to form a magically merged new mega-operating-system (mmm...tastes Chromey). The reality, of course, was much more nuanced: Google's kept both Chrome OS and Android around and simply worked little by little to align the two entities and make 'em more complementary, consistent, and connected. (Huh. Who woulda thunk?!)


    • Google Sheets cheat sheet: How to get started Tue, 22 Mar 2022 03:00:00 -0700

      Google Sheets is a powerful spreadsheet app that you use through your web browser. It stores your spreadsheets in the cloud with Google Drive. Anyone with a Google account can use Sheets and Drive for free. Both are also part of Google Workspace (formerly G Suite), Google’s subscription office suite for business and enterprise customers.

      This guide will teach you how to start a new spreadsheet in Sheets or upload one you already have stored on your PC, including a Microsoft Excel spreadsheet. It also goes over the basic interface and unique features of Sheets, such as how to share your spreadsheets and collaborate on them with others.
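      The guide focuses on the browser interface, but the same first steps can be scripted. Below is a minimal sketch using the Google Sheets v4 and Drive v3 REST APIs through the google-api-python-client library; the credentials object (creds) and the collaborator's email address are illustrative assumptions, not part of the cheat sheet itself:

          from googleapiclient.discovery import build

          # Assumption: `creds` is an authorized credentials object that already
          # carries the spreadsheets and drive OAuth scopes.
          sheets = build("sheets", "v4", credentials=creds)
          drive = build("drive", "v3", credentials=creds)

          # Create a new, blank spreadsheet, the scripted equivalent of
          # choosing "Blank" on the Sheets home page.
          spreadsheet = sheets.spreadsheets().create(
              body={"properties": {"title": "Quarterly budget"}}
          ).execute()

          # Share it with a collaborator, mirroring the Share button in the UI.
          drive.permissions().create(
              fileId=spreadsheet["spreadsheetId"],
              body={
                  "type": "user",
                  "role": "writer",
                  "emailAddress": "colleague@example.com",  # hypothetical address
              },
          ).execute()

          print("Collaborators can edit at:", spreadsheet["spreadsheetUrl"])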


    • Apple supplier Foxconn halts production amid COVID-19 outbreak in China Mon, 14 Mar 2022 11:02:00 -0700

      Some of the world’s largest tech manufacturers have temporarily shuttered operations in China amid government lockdowns in multiple provinces and cities meant to stem the spread of a COVID-19 outbreak.

      As part of its zero-tolerance strategy to suppress COVID-19’s spread, the Chinese government ordered lockdowns in several places, including Shanghai and Shenzhen, a major container port and tech hub. Residents in cities under lockdown can only leave their homes for essential reasons.

      Foxconn, the world's biggest contract electronics manufacturer and Apple’s largest iPhone supplier, said in a statement today it will suspend production at its Longhua and Guanlan factories in Shenzhen. Those factories will remain closed “until further notice from the local government.”


    • Tech giants move to reopen offices, but differ on hybrid-work plans Wed, 09 Mar 2022 03:01:00 -0800

      Apple, Google, and Twitter have now set dates to reopen their offices, signaling both the end of widespread home working and the rollout of “hybrid” remote policies.

      Like most organizations, each company shuttered corporate offices in accordance with social distancing measures in the early stages of the COVID-19 pandemic in 2020, shifting quickly to support remote work. 

      Now, having already delayed reopening several times due to the ongoing pandemic, the three tech bigwigs have outlined their respective plans to open offices again, adopting hybrid policies that combine remote and in-office work for staff. How those plans work out could shape what other, smaller companies do to deal with changing workplace expectations.



