
Google LLC is an American multinational technology company that specializes in Internet-related services and products, which include online advertising technologies, a search engine, cloud computing, software, and hardware. Google was launched in September 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University in California. Some of Google’s products are Google Docs, Google Sheets, Google Slides, Gmail, Google Search, Google Duo, Google Maps, Google Translate, Google Earth, and Google Photos.

Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford University in California. The project initially involved an unofficial "third founder", Scott Hassan, the original lead programmer who wrote much of the code for the original Google Search engine, but he left before Google was officially founded as a company.

Google Blog

  • Google and U.S. developers find agreement over Google Play store Sat, 30 Jul 2022 23:55:00 +0000

    The Android app economy has helped create nearly two million American jobs; developers around the world have earned more than $120 billion using the Google Play Store. We’re proud that Google Play helps developers build great apps and rewards them for doing so. And we know that a successful ecosystem must benefit both developers and consumers, which is why we have rules of the road to keep the store secure, protect privacy and prevent fraud. While we strive to make Google Play the best platform for everyone, Android also provides consumers and developers the opportunity to use other app store options.

    Today, we’re pleased to share a proposed agreement that will help ensure that both developers and consumers can continue to benefit from Google Play. Google and a group of U.S. developers have reached a proposed settlement that allows both parties to move forward and avoids years of uncertain and distracting litigation.

    As part of the settlement, we’re establishing a $90 million fund to support U.S. developers who earned two million dollars or less in annual revenue through Google Play during each year from 2016-2021. A vast majority of U.S. developers who earned revenue through Google Play will be eligible to receive money from this fund, if they choose. If the Court approves the settlement, developers that qualify will be notified and allowed to receive a distribution from the fund.
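
    Read literally, the stated criterion lends itself to a simple check. The sketch below is illustrative only; the function name and data shape are assumptions, not anything defined by the settlement:

```python
# Illustrative sketch: a literal reading of the stated criterion --
# $2 million or less in Google Play revenue in each year from 2016 to 2021.
def fund_eligible(annual_play_revenue_by_year):
    return all(annual_play_revenue_by_year.get(year, 0) <= 2_000_000
               for year in range(2016, 2022))

print(fund_eligible({2016: 50_000, 2019: 1_200_000}))  # True
print(fund_eligible({2016: 50_000, 2019: 2_500_000}))  # False: 2019 exceeds $2M
```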

    In addition to the fund, we’re committing to maintain a number of existing practices and implement new benefits that help developers innovate and communicate with their users:

    • To continue to provide developers with a tiered pricing model, we’ll maintain Google’s 15% commission rate for the first $1 million in annual revenue earned from the Google Play Store for U.S. developers, which we implemented in 2021.
    • We’re revising our Developer Distribution Agreement to make it clear that developers can continue to use contact information obtained in-app to communicate with users out-of-app, including about subscription offers or lower-cost offerings on a rival app store or the developer’s website.
    • In new versions of Android, Google will maintain certain changes implemented in Android 12 that make it even easier for people to use other app stores on their devices, while being careful not to compromise the safety measures Android has in place.
    • To showcase independent and small startup developers building unique high-quality apps, we’re creating an “Indie Apps Corner” that will appear on the apps tab on the U.S. Google Play homepage and shine a spotlight on these developers.
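
    As a rough numerical illustration of the tiered pricing model in the list above (the 30% rate applied above the first $1 million is the standard Play service fee, assumed here rather than stated in the settlement):

```python
def play_commission(annual_revenue, tier=1_000_000, low_rate=0.15, high_rate=0.30):
    """Tiered commission: 15% on the first $1M of annual revenue,
    the standard 30% service fee above it (the upper rate is an assumption)."""
    low = min(annual_revenue, tier)
    high = max(annual_revenue - tier, 0)
    return low * low_rate + high * high_rate

print(play_commission(800_000))    # 120000.0 -- entirely within the 15% tier
print(play_commission(2_500_000))  # 600000.0 = 150000 + 450000
```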

    These commitments, including the $90 million fund, build on a number of ways we already support developers, such as providing tools that help developers build great apps, lower their costs, and grow their businesses. In fact, compared to other prominent digital content stores, we provide developers more ways to interact with their customers.

    Finally, we’ve heard developers want to understand more about how Google Play operates, which is why we’ve agreed to publish annual transparency reports. The reports will share information about the Google Play Store, including statistics such as apps removed from Google Play, account terminations, and other data regarding how users interact with Google Play.

    We’re pleased that we worked with the developers to propose this agreement for the Court’s approval. As the agreement notes, we remain confident in our arguments and case, but this settlement will avoid protracted and unnecessary litigation with developers, whom we see as vital partners in the Android ecosystem. We remain steadfast in our commitment to building thriving, open platforms that empower consumers and help developers succeed.

  • New Google Workspace features to help solo business owners Thu, 30 Jun 2022 17:30:00 +0000

    Over the past few years, we’ve seen more people forging their own path and turning their personal passions into businesses. These individual business owners, sometimes called “solopreneurs,” wear many hats as they run and grow their businesses: salesperson, marketer, accountant, the list goes on.

    That’s why one year ago, we launched Google Workspace Individual as a new offering to help these solo business owners grow their businesses with the familiar apps they’re likely already using in their personal lives. We’ve heard from customers that Google Workspace Individual helps them spend more time doing what they love — like meeting with customers and designing personalized services — and less time on recurring tasks like scheduling appointments and sending emails. Since launch, we’ve delivered a number of improvements to provide even more value to customers, and today we’re announcing what’s coming next: electronic signatures right within Google Docs.

    Coming soon: Easily sign important documents right in Google Docs

    Whether you’re an event planner or digital creator, it can be a challenge to stay on top of contracts and customer agreements that need to be signed as you’re constantly context switching and jumping between different apps to get work done. That’s why we’re natively integrating eSignature in Google Docs, so you can quickly execute agreements from the familiar interface of Docs without having to switch tabs or apps.

    Animation of the process of inserting electronic signature fields in Google Docs

    Coming soon: Easily request electronic signatures directly in Google Docs

    eSignature in Google Docs will take advantage of the same secure-by-design infrastructure and built-in protections Google uses to help secure your information and safeguard your privacy. Let’s take a look at how eSignature can help you create agreements:

    • Collaborate in documents: Collaborate on changes directly in Google Docs with comments and suggestions — no need to export the file to send a draft contract over email.
    • Add fields to documents: Within the familiar Google Docs interface, you can easily drag and drop signature and date fields in branded documents you create.
    • Request a signature: Once you resolve all comments and suggestions, requesting a signature is as easy as sharing a file in Drive.
    • Add signatures: When ready to sign, the signee can easily add their signature, no downloads needed. Once the signature is added, a completed PDF contract is emailed to both parties.
    • Monitor and track progress: Quickly see the status of pending signatures and easily find completed, signed contracts.
    • Create copies of contracts: For signature workflows that need to be repeated regularly, you can streamline the process by creating copies of existing contracts and then modifying as needed.

    eSignature in Google Docs is coming soon in Beta to Google Workspace Individual users and is the latest in a series of improvements we’ve announced for the subscription in the past year. If you’re already using a dedicated eSignature solution, Google Workspace integrates with a number of leading providers. Learn more about how these eSignature and other integrations can help you optimize your workspace on our blog post.

    ICYMI: Google Workspace Individual updates from this past year

    Email marketing updates for engaging campaigns

    For any business, it’s vital to connect with customers and prospects, both on a one-to-one basis and at a large scale. Google Workspace Individual makes it easy to do both, so you can easily send communications like monthly newsletters and also offer items like scheduled consultations.

    Animation of the process of creating and sending customized marketing emails from Gmail

    Create and send customized marketing emails from Gmail

    To help you reach many customers at once, last year we added a way to run simple email campaigns directly in Gmail. We started first by providing professionally designed templates that you can customize with your own branding and images in just a few clicks. Then earlier this year, we added multi-send, which allows you to deliver individual emails to a large number of recipients with a unique unsubscribe link for each recipient. With the combination of these improvements, it’s easy to make communications as targeted as you like, because you can create multiple email mailing lists within Google Contacts for different audiences and easily tailor the message to each audience. Gmail layouts and multi-send are generally available in Google Workspace Individual today.
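
    The general pattern behind multi-send (one individual message per recipient, each carrying a unique unsubscribe link) can be sketched generically. This is not Gmail's actual implementation or API; every name and URL below is illustrative:

```python
import secrets
from email.message import EmailMessage

def build_campaign(recipients, subject, body_template, unsubscribe_base):
    """Build one message per recipient, each with a unique unsubscribe token."""
    messages = []
    for addr in recipients:
        token = secrets.token_urlsafe(16)          # unique per recipient
        link = f"{unsubscribe_base}?t={token}"
        msg = EmailMessage()
        msg["To"] = addr
        msg["Subject"] = subject
        msg["List-Unsubscribe"] = f"<{link}>"      # RFC 2369 unsubscribe header
        msg.set_content(body_template.format(unsubscribe=link))
        messages.append(msg)
    return messages

campaign = build_campaign(
    ["a@example.com", "b@example.com"],
    "Monthly newsletter",
    "Hello!\nTo stop receiving these emails: {unsubscribe}",
    "https://example.com/unsubscribe",
)
```

    Sending each recipient an individual message, rather than one message with many addresses, is what makes per-recipient personalization and unsubscribe tracking possible.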

    Appointment scheduling updates for easier bookings

    For scheduling in-person appointments or virtual meetings, Google Calendar helps streamline the appointment scheduling process and avoid back-and-forth communication to find a time that works. Since launching, we’ve made a number of enhancements that improve the experience for both the business owner and scheduler, including the ability to:

    • Help prevent no-shows by customizing the timing of reminder emails and having users verify their email before booking for added security.
    • Reflect your operational needs by setting flexible appointment durations, adding buffer time between appointments and limiting the number of bookings per day.
    • Easily update your availability with one-off exceptions like regional holidays and customizable start and end dates.
    Animation of creating a shareable appointment schedule that clients can use to book appointments online by setting your availability and appointment offerings directly in Google Calendar.

    Get your own professional booking page that stays up to date

    Customized appointment scheduling with the above features is generally available in Google Workspace Individual today, on the web and on mobile.

    Google Meet updates for your customer and partner calls

    Once an appointment is on the books and it’s time to connect, Google Meet provides an easy way for you to deepen customer and partner relationships through secure video meetings. Helpful features in Meet ensure you can be clearly seen and heard. Noise cancellation removes background distractions like barking dogs, while low-light mode automatically adjusts your video in poorly lit settings. Here are a few notable Meet announcements from this past year:

    • Mimic taking your call from a real-life cafe or condo with immersive backgrounds.
    • Filter out the echoes in spaces with hard surfaces so that you can have conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.
    • Clearly see participants on a call while you’re presenting or multi-tasking with picture-in-picture on Chrome browsers.
    • Review your forecast or business proposal with meetings directly in Docs, Sheets and Slides.
    Animation of joining a Google Meet video call directly from Google Docs.

    Quickly join a Google Meet call from Google Docs, Sheets and Slides

    Sign up today to take advantage of promotional pricing

    Save 20% until October 2022 when you sign up for Google Workspace Individual today, or learn more about Google Workspace Individual on our website.

  • Google for Mexico: Economic recovery through technology Thu, 30 Jun 2022 17:00:00 +0000

    During the pandemic, different technological tools allowed us to stay connected, collaborate and find the best responses to overcome the challenges in front of us.

    As we move forward, we want to become Mexico's trusted technology ally and contribute to the country with programs, products and initiatives that promote economic, social and cultural development. Today, at our second Google for Mexico event, we aim to accelerate the country's economic recovery by helping people find more and better jobs, making it easier for businesses to grow, reducing the gender gap and promoting financial inclusion.

    Improving Mexicans’ lives through technology

    In collaboration with the Ministry of Public Education, we helped students across the country to continue their school year by providing more than 20 million free Google for Education accounts. We have trained more than 1.9 million people in Mexico through Grow with Google and Google.org grants. And we have worked together with the Ministry of Tourism to create a joint strategy to digitize the travel sector, and partnered with the Ministry of Economy on gender gap reduction projects and a technological innovation program for manufacturing companies in the southeast region of the country.

    According to a study we conducted with AlphaBeta, we estimate that in 2021 companies in the country obtained annual economic benefits worth more than $7.7 billion from Google products (Google Search & Ads, AdSense, Google Play and YouTube), approximately three times the impact in 2018 ($2.3 billion).

    Today, more people in the world are using their smartphones to save credit and debit cards and to buy new things. Over the last few years, we have seen rapid digitization of essentials that we carry with us every day, such as car keys, digital IDs and vaccine records.

    That’s why we are announcing that Mexico is part of the global launch of Google Wallet on Android and Wear OS. Google Wallet will initially launch with support for payment cards and loyalty passes and eventually expand to new experiences like transit and event tickets, boarding passes, car keys and digital IDs.

    $10 million from Google.org

    Mexico's Southeast region is home to more than 50% of the country's indigenous population; it is also a region affected by poverty and significant social vulnerability. Google.org, the philanthropic arm of Google, is allocating $10 million — the largest amount of funding the organization has provided in the country — to this region’s transformation. This initiative will run over the next three years and will primarily benefit women, supporting programs focused on promoting economic opportunities that accelerate financial inclusion and reduce the gender gap.

    A Mexican woman wearing a red dress with a white ruffle stands in front of hills, looking slightly away from the camera.

    Women from Mexico's Southeast region will benefit from Google.org's $10 million fund through local and regional NGOs.

    Technology as a booster for jobs

    In 2019, during the first edition of Google for Mexico, we announced the launch of Google Career Certificates alongside a grant of $1.1 million for International Youth Foundation Mexico (IYF). Through this grant, IYF has trained 1,200 young people. Seventy percent of the graduates managed to get a new job, while participants who were already employed raised their income by more than 30%. To expand this initiative, and as part of the $10 million fund to support Mexico’s Southeast region, we are announcing a $2 million grant to support IYF in taking the project into the region and training 2,300 women from the community.

    Supporting the news industry

    In late 2020, we launched Google News Showcase, an initiative that offers a better experience for readers and news editors. Google News Showcase is a licensing program to pay publishers for high-quality content. This program will help participating publishers monetize their content through an enhanced storytelling experience that lets people go deeper into more complex stories and stay informed about different issues and interests.

    Today we are announcing the beginning of negotiations with local media to soon launch a News Showcase in México. We are excited to continue contributing to the country’s media ecosystem, and offer our users relevant, truthful and quality information on local, national and international news.

    Illustration of a finger swiping through Google News panels on a screen

    Google News Showcase will bring a better experience for readers and news publishers in Mexico.

    Preserving and promoting native languages

    Every 14 days, a language becomes extinct. This means that out of the 7,000 existing tongues in the world, more than 3,000 are in danger of vanishing. To support the efforts of groups dedicated to language preservation, Google Arts & Culture is collaborating with partners around the world to launch Woolaroo, an experiment that uses machine learning to identify objects and show them in native languages.

    Using their phone's camera, users can take a photo of their surroundings to receive a translation and its correct pronunciation. Woolaroo launched with 10 languages, and seven more have since been added, including Maya and Tepehua.

    Animated GIF of a hand holding a phone that shows nature pictures that reflect the background.

    Woolaroo, a language preservation experiment powered by machine learning, will include ancestral languages Maya and Tepehua.

    At Google, we believe technology is a powerful engine for helping Mexicans across the country, providing intelligent solutions for millions of people.

  • Mahima Pushkarna is making data easier to understand Thu, 30 Jun 2022 16:00:00 +0000

    Five years ago, information designer Mahima Pushkarna joined Google to make data easier to understand. As a senior interaction designer on the People + AI Research (PAIR) team, she designed Data Cards to help everyone better understand the contexts of the data they are using. The Data Cards Playbook puts Google’s AI Principles into practice by providing opportunities for feedback, relevant explanations and appeal.

    Recently, Mahima’s paper on Data Cards (co-written with Googlers Andrew Zaldivar and Oddur Kjartansson) was accepted to the ACM Conference on Fairness, Accountability and Transparency (ACM FAccT). Let’s catch up with her and find out more about what brought her to Google.

    How did your background lead you to the work you’re doing now?

    I've always been fascinated by conjuring up solutions to things. The kind of questions that I’ve found meaningful are those that are never truly solved, or never have one correct answer. (The kind of questions that exasperate us!) Those have been the problems I am always drawn towards.

    Early in my career, I realized the power in visualizing data, but spreadsheets were intimidating. I wondered how design could make communicating complexity easier. So I found myself in grad school in Boston studying information design and data visualization. I focused on how people experience data and how our relationships to each other and our contexts are mediated.

    I joined Google Brain as the first visual designer in a full-time capacity, though I had no background in artificial intelligence or machine learning — this was the deep end of the pool. This opened up the space to explore human-AI interaction, and make AI more accessible to a broader class of developers. At PAIR, my work focuses on making information experiences more meaningful for developers, researchers and others who build AI technologies.

    What’s it like to have a unique background as a designer on a technical AI research team?

    When you're an engineer and immersed in building technology, it's easy to assume everyone has a similar experience to your own — especially when you’re surrounded by peers who share your expertise. The actual user experience is very personal and varies drastically across users and contexts. That particular clarity is what designers bring to the table.

    I’ve been able to engage my engineering and research colleagues with simple, people-centered questions right in the very beginning. How are people using an AI tool? What are they learning from it? Who else might be involved in the conversation? Do they have the proficiency we assume they have?

    “Identifying what we don’t know about data is just as important as articulating what we do know.”

    How did you begin designing Data Cards?

    This project started when I was working on another visualization toolkit, Facets, to communicate the skews and imbalances within datasets to help machine learning practitioners make informed decisions. At the time, transparency was a moving target. Andrew, Tulsee Doshi and I started to proactively think about fairness in data, and saw a huge gap in the documentation of human decisions that dot a dataset's lifecycle.

    This “invisible” information shapes how we use data and the outcomes of models trained on them. For example, a model trained on a dataset that captures age in just two or three buckets will have very different outcomes compared to a dataset with ten buckets. The goal of Data Cards is to make both visible and invisible information about datasets available and simple to understand, so people from a variety of backgrounds can knowledgeably make decisions.
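
    The age-bucket example can be made concrete with a toy equal-width bucketing function (purely illustrative; real datasets bucket values in many different ways):

```python
def bucketize(age, n_buckets, lo=0, hi=100):
    """Assign an age to one of n equal-width buckets over [lo, hi)."""
    width = (hi - lo) / n_buckets
    return min(int((age - lo) // width), n_buckets - 1)

ages = [23, 31, 61]
coarse = [bucketize(a, 3) for a in ages]   # [0, 0, 1]: 23 and 31 collapse together
fine = [bucketize(a, 10) for a in ages]    # [2, 3, 6]: the distinction survives
```

    A model trained on the coarse buckets can never distinguish a 23-year-old from a 31-year-old; that bucketing choice is exactly the kind of invisible decision a Data Card is meant to surface.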

    As we cover in our FAccT paper, Andrew and Oddur and I arrived at two insights. The first is that identifying what we don’t know about data is just as important as articulating what we do know. In capturing these nuances, it is possible to narrow those knowledge gaps before even collecting data. The second thing that surprised us was the sheer number of people involved in a dataset’s life cycle, and how fragile knowledge is. Context is easily lost in translation both between and within teams, across documents, emails, people and time.

    Data Cards stand on the shoulders of giants, like Datasheets (Gebru et al.) and Model Cards (Mitchell et al.). We've been immensely lucky to have had the support of many original authors of these seminal papers that have paved our path to FAccT.

    How do you hope the paper is used across the tech industry?

    Imagine a world in which finding verifiable information about the motivations of a dataset’s creators or performance of a model is as easy as learning about the ethical beliefs of a celebrity or the rating of a movie. Our vision for Data Cards is that they become a cultural mainstay — invisible, but their absence would be missed by ML practitioners.

    In this paper, we introduce frameworks that other teams can use in their work. Alongside that, we’ve open-sourced the Data Cards Playbook, so we're trying to lower the barrier to access in every way possible.

  • Countering hack-for-hire groups Thu, 30 Jun 2022 16:00:00 +0000

    As part of TAG's mission to counter serious threats to Google and our users, we've published analysis on a range of persistent threats including government-backed attackers, commercial surveillance vendors, and serious criminal operators. Today, we're sharing intelligence on a segment of attackers we call hack-for-hire, whose niche focuses on compromising accounts and exfiltrating data as a service.

    In contrast to commercial surveillance vendors, who we generally observe selling a capability for the end user to operate, hack-for-hire firms conduct attacks themselves. They target a wide range of users and opportunistically take advantage of known security flaws when undertaking their campaigns. Both, however, enable attacks by those who would otherwise lack the capabilities to do so.

    We have seen hack-for-hire groups target human rights and political activists, journalists, and other high-risk users around the world, putting their privacy, safety and security at risk. They also conduct corporate espionage, handily obscuring their clients’ role.

    To help users and defenders, we will provide examples of the hack-for-hire ecosystem from India, Russia, and the United Arab Emirates and context around their capabilities and persistence mechanisms.

    How Hack-For-Hire Operations Work

    The hack-for-hire landscape is fluid, both in how the attackers organize themselves and in the wide range of targets they pursue in a single campaign at the behest of disparate clients. Some hack-for-hire attackers openly advertise their products and services to anyone willing to pay, while others operate more discreetly, selling to a limited audience.

    For example, TAG has observed Indian hack-for-hire firms work with third-party private investigative services — intermediaries that reach out for services when a client requires them — and provide data exfiltrated from a successful operation. This is detailed in depth in today’s Reuters investigation into the Indian hack-for-hire ecosystem. We have also observed Indian hack-for-hire firms work with freelance actors not directly employed by the firms themselves.

    The breadth of targets in hack-for-hire campaigns stands in contrast to many government-backed operations, which often have a clearer delineation of mission and targets. A recent campaign from an Indian hack-for-hire operator was observed targeting an IT company in Cyprus, an education institution in Nigeria, a fintech company in the Balkans and a shopping company in Israel.

    Recent Hack-for-Hire Campaigns


    India

    Since 2012, TAG has been tracking an interwoven set of Indian hack-for-hire actors, many of whom previously worked for the Indian offensive security providers Appin and Belltrox.

    One cluster of this activity frequently targets government, healthcare, and telecom sectors in Saudi Arabia, the United Arab Emirates, and Bahrain with credential phishing campaigns. These credential phishing campaigns have ranged from targeting specific government organizations to AWS accounts to Gmail accounts.

    Sample AWS phishing email

    Sample AWS phishing page

    TAG has linked former employees of both Appin and Belltrox to Rebsec, a new firm that openly advertises corporate espionage as an offering on its company website.

    Rebsec’s offerings as per the company’s website


    Russia

    While investigating a 2017 credential phishing campaign that targeted a prominent Russian anti-corruption journalist, we discovered the Russian attacker targeting other journalists, politicians across Europe, and various NGOs and non-profit organizations. What stood out during this investigation was the breadth of targeting, which also included individuals who had no affiliation with the selected organizations and appeared to be regular, everyday citizens in Russia and surrounding countries. This hack-for-hire actor has been publicly referred to as 'Void Balaur'.

    These campaigns were similar regardless of target, consisting of a credential phishing email with a link to an attacker-controlled phishing page. The lures ranged from fake Gmail and other webmail provider notifications to messages spoofing Russian government organizations. After the target account was compromised, the attacker generally maintained persistence by granting an OAuth token to a legitimate email application like Thunderbird or generating an App Password to access the account via IMAP. Both OAuth tokens and App Passwords are revoked when a user changes their password.

    Russian hack-for-hire phishing email

    Russian hack-for-hire phishing site

    During our early investigation, TAG discovered the attacker’s public website (no longer available) advertising account hacking capabilities for email and social media services. The site claimed to have received positive reviews on Russian underground forums such as Dublikat and Probiv.cc. Over the past five years, TAG has observed the group targeting accounts at major webmail providers like Gmail, Hotmail, and Yahoo! and regional webmail providers like abv.bg, mail.ru, inbox.lv, and UKR.net.

    Pricing list from hacknet-service.com in 2018

    United Arab Emirates

    TAG is also tracking a hack-for-hire group now based in the United Arab Emirates that is mostly active in the Middle East and North Africa. They have primarily targeted government, education, and political organizations including Middle East focused NGOs in Europe and the Palestinian political party Fatah. Amnesty International has also reported on their campaigns.

    The group commonly uses Google or OWA password reset lures to steal credentials from targets, often using the MailJet or SendGrid API to send phishing emails. Unlike many hack-for-hire actors that use open source phishing frameworks like Evilginx or GoPhish, this group uses a custom phishing kit built on Selenium, a self-described 'suite of tools for automating web browsers.' Previously described by Amnesty, this phishing kit has remained under active development over the past five years.

    Google Security Alert phishing page

    After compromising an account, the actor maintains persistence by granting themselves an OAuth token to a legitimate email app like Thunderbird, or by linking the victim Gmail account to an attacker-owned account on a third-party mail provider. The attacker would then use a custom tool to download the mailbox contents via IMAP.

    This group also has links to the original developers of H-Worm, also known as njRAT. In 2014, Microsoft filed a civil suit against the developer, Mohammed Benabdellah, for the development and dissemination of H-Worm. Benabdellah, who also goes by the moniker Houdini, has been actively involved in the day-to-day development and operational deployment of the credential phishing capabilities used by this group since its inception.

    Protecting Our Users

    As part of our efforts to combat serious threat actors, we use the results of our research to improve the safety and security of our products. Upon discovery, all identified websites and domains were added to Safe Browsing to protect users from further harm. We encourage all high-risk users to enable Advanced Protection and Google Account Level Enhanced Safe Browsing, and to ensure that all their devices are updated. Additionally, our CyberCrime Investigation Group is sharing relevant details and indicators with law enforcement.
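
    For defenders who want to check URLs against Safe Browsing data programmatically, the public Safe Browsing Lookup API (v4) accepts a threatMatches:find request. The sketch below only builds the request body; the client ID is a placeholder, and an API key is required to actually POST it:

```python
import json

# Endpoint for the Safe Browsing v4 Lookup API; <API_KEY> is a placeholder.
SAFE_BROWSING_ENDPOINT = (
    "https://safebrowsing.googleapis.com/v4/threatMatches:find?key=<API_KEY>"
)

def build_lookup_request(urls, client_id="example-client"):
    """Request body for the Safe Browsing v4 threatMatches:find method."""
    return {
        "client": {"clientId": client_id, "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

body = build_lookup_request(["http://example.test/login"])
print(json.dumps(body, indent=2))  # POST this body to the endpoint above
```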

    TAG is committed to sharing our findings as a way of raising awareness with the security community, and with companies and individuals that might have been targeted. We hope that improved understanding of the tactics and techniques will enhance threat hunting capability and lead to stronger user protections across the industry.

    With contributions from Winnona DeSombre

    Indicators of Compromise

    UAE hack-for-hire Group Domains:

    • myproject-login[.]shop
    • mysite-log[.]shop
    • supp-help[.]me
    • account-noreply3[.]xyz
    • goolge[.]ltd
    • goolge[.]help
    • account-noreply8[.]info
    • account-server[.]xyz
    • kcynvd-mail[.]com
    • mail-goolge[.]com
    • kcynve-mail[.]com

    Indian hack-for-hire Group Domains:

    • dtiwa.app[.]link
    • share-team.app[.]link
    • mipim.app[.]link
    • processs.app[.]link
    • aws-amazon.app[.]link
    • clik[.]sbs
    • loading[.]sbs
    • userprofile[.]live
    • requestservice[.]live
    • unt-log[.]com
    • webtech-portal[.]com
    • id-apl[.]info
    • rnanage-icloud[.]com
    • apl[.]onl
    • go-gl[.]io

    Russian hack-for-hire Group Domains:

    • login-my-oauth-mail[.]ru
    • oauth-login-accounts-mail[.]ru
    • my-oauth-accounts-mail[.]ru
    • login-cloud-myaccount-mail[.]ru
    • myaccounts-auth[.]ru
    • security-my-account[.]ru
    • source-place-preference[.]ru
    • safe-place-smartlink[.]ru
    • safe-place-experience[.]ru
    • preference-community-place[.]ru
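
    The domains above are "defanged" with [.] so they cannot be followed accidentally. As a small illustrative sketch (not an official tool), here is how a defender might refang such indicators and screen URLs against them:

```python
from urllib.parse import urlparse

def refang(indicator: str) -> str:
    """Turn a defanged indicator like 'goolge[.]ltd' back into 'goolge.ltd'."""
    return indicator.replace("[.]", ".").lower()

def host_is_flagged(url: str, defanged_iocs) -> bool:
    """True if the URL's hostname equals a refanged indicator or is a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    for ioc in (refang(i) for i in defanged_iocs):
        if host == ioc or host.endswith("." + ioc):
            return True
    return False

# A few of the indicators listed above, still in defanged form.
iocs = ["goolge[.]ltd", "mail-goolge[.]com", "account-server[.]xyz"]
```

    For example, host_is_flagged("https://login.mail-goolge.com/auth", iocs) returns True, while a legitimate Google URL does not match.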
  • Staying safe online with our updated Google Password Manager Thu, 30 Jun 2022 16:00:00 +0000

    Strong, unique passwords are key to helping keep your personal information secure online. That's why Google Password Manager can help you create, remember and autofill passwords on your computer or phone: on the web in Chrome, and in your favorite Android and iOS apps.

    Today we've started rolling out a number of updates that help make the experience easier to use, with even stronger protections built in.

    A consistent look and feel, across web and apps

    We're always grateful for feedback, and many of you have shared that managing passwords between Chrome and Android has been confusing at times: "It's the same info in both places, so why does it look so different?" With this release, we're rolling out a simplified and unified management experience that's the same in Chrome and Android settings. If you have multiple passwords for the same sites or apps, we’ll automatically group them. And for your convenience, you can create a shortcut on your Android home screen to access your passwords with a single tap.

    You can now add a shortcut to Google Password Manager to your Android homescreen.

    More powerful password protections

    Google Password Manager can create unique, strong passwords for you across platforms, and helps ensure your passwords aren’t compromised as you browse the web. We’re constantly working to expand these capabilities, which is why we’re giving you the ability to generate passwords for your iOS apps when you set Chrome as your autofill provider.
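
    Google has not published the exact generator Password Manager uses, but the underlying idea, a long random password drawn from a mixed character set using a cryptographically secure source, can be sketched with Python's standard secrets module (the function name and character classes here are our own choices):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password that mixes all four character classes."""
    classes = [string.ascii_lowercase, string.ascii_uppercase,
               string.digits, "!@#$%^&*-_"]
    # Take one character from each class so none can be missing...
    chars = [secrets.choice(c) for c in classes]
    # ...fill the rest from the combined alphabet...
    alphabet = "".join(classes)
    chars += [secrets.choice(alphabet) for _ in range(length - len(classes))]
    # ...and shuffle so the class positions are unpredictable.
    secrets.SystemRandom().shuffle(chars)
    return "".join(chars)
```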

    You can now create strong passwords on your computer or mobile, on any operating system.

    Chrome can automatically check your passwords when you enter them into a site, but you can check them in bulk with Password Checkup for an added layer of confidence. We’ll now flag not only compromised credentials, but also weak and re-used passwords on Android. If Google warns you about a password, you can now fix it without hassle with our automated password change feature on Android.
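
    Password Checkup's real protocol adds cryptographic blinding so Google never sees your credentials, but the core privacy idea, sending only a short hash prefix and comparing full hash suffixes locally, can be sketched roughly as follows (the lookup function stands in for a server and is entirely hypothetical):

```python
import hashlib

def sha1_hex(password: str) -> str:
    """Uppercase hex SHA-1 digest of a password (for illustration only)."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest().upper()

def is_breached(password: str, suffixes_for_prefix) -> bool:
    """suffixes_for_prefix(prefix) models a server query returning all known
    breached-hash suffixes sharing the 5-character prefix; the full password
    hash never leaves the client."""
    digest = sha1_hex(password)
    prefix, suffix = digest[:5], digest[5:]
    return suffix in suffixes_for_prefix(prefix)
```

    The server learns only a 5-character prefix shared by many unrelated passwords, so it cannot tell which password was actually checked.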

    For your peace of mind, Password Checkup on Android can flag compromised, weak and reused passwords.

    To help protect even more people, we’re expanding our compromised password warnings to all Chrome users on Android, Chrome OS, iOS, Windows, macOS and Linux.

    Simplified access and password management

    Google built its password manager to stay out of your way — letting you save passwords when you log in, filling them when you need them and ensuring they aren’t compromised. However, you might want to add your passwords to the app directly, too. That's why, due to popular demand, we're adding this functionality to Google Password Manager on all platforms.

    Adding your passwords directly is now possible on all platforms.

    In 2020, we announced Touch-to-Fill to help you fill your passwords in a convenient and recognizable way. We’re now bringing Touch-to-Login to Chrome on Android to make logging in even quicker by allowing you to securely log in to sites directly from the overlay at the bottom of your screen.

    Touch-to-Login signs you in directly from a recognizable overlay.

    Many of these features were developed at the Google Safety Engineering Center (GSEC), a hub of privacy and security experts based in Munich, so Guten Tag from the team! Of course, our efforts to create a safer web are a truly global effort – from our early work on 2-step verification, to our future investments in technologies like passkeys – and these updates that we are rolling out over the next months are an important part of that work.

  • How this Google intern is spending her summer Thu, 30 Jun 2022 14:00:00 +0000

    Welcome to the latest edition of “My Path to Google,” where we talk to Googlers, interns, apprentices and alumni about how they got to Google, what their roles are like and even some tips on how to prepare for interviews.

    Today’s post is all about Micka Alencar, an intern from Brazil who’s spending her summer on the Google Cloud team.

    Can you tell us a bit about yourself?

    I study production engineering at the Federal University of São Carlos in Sorocaba, Brazil. This summer, I’m interning for the Google Cloud team! Helping people really energizes me, so I also volunteer with local community projects in my free time, like teaching English classes to children. Outside of that, I like to watch anime and spend time with my family.

    What do you do at Google?

    I’m a Google Cloud Strategy and Sales Operations intern. In this role, I research cloud market trends and look for opportunities to grow our Google Cloud business in Latin America. Right now, I'm working on two main projects: developing a more structured onboarding process for our sales team, and building a framework for measuring our team’s progress.

    Why did you apply to Google?

    I decided to apply for an internship at Google because of how closely the company's values ​​align with mine. I’ve always dreamed of working at a place where I wouldn’t just be a number, but an important part of something bigger. At Google, my work is relevant, I’m heard and I can be myself without any judgment or fear.

    How did your interview process go?

    Google’s interview process was different from what I was used to. Here, you’re evaluated across a broad set of skills, not just your technical abilities. Both of my interviewers were very kind, and they made me feel comfortable from the start. They helped turn that evaluative moment into a pressure-free conversation where I could share my points of view, ideas and, ultimately, who I truly am.

    Micka, in a black dress, poses with three other family members. They are wearing formal clothing and have bouquets of flowers behind them.

    Micka and her family, who she credits with playing a big role in her professional development.

    How did you prepare for your interviews?

    To prepare for my interviews, I reflected on critical moments in my life and career and crafted a narrative around them. I also visited Google’s Careers site for interview tips and did several rounds of mock interviews with my friends.

    How has it been working remotely?

    Amazing! My onboarding process was so well organized, and my team has given me the support I need to succeed in this internship. And even though I’m working remotely, I recently met my team in person at the São Paulo office. This was an important moment for us to get to know each other even better.

    Any advice you’d give to aspiring Google interns?

    First, be authentic and don't try to fit into patterns or create false appearances. At Google, individuality is highly appreciated and is an important part of the candidate selection process. Second, don't diminish yourself in any way — you can be whoever you want to be. And if you think you can work at Google, then you can! And finally, dedicate yourself to your dream and believe strongly in your capabilities.

  • A Queer Eye on Art History with Google Arts & Culture Thu, 30 Jun 2022 13:00:00 +0000

    Editor's Note: In honor of Pride Month and beyond, and in collaboration with over 60 cultural institutions, Google Arts & Culture presents the "A Queer Eye on Art History" hub. It’s a place where you can explore archives and collections to celebrate LGBTQIA+ lives and art and dive into more than 20 newly curated stories, new collections from partners, and much more. Today, one of our partners, Andrew Shaffer, Interim Co-Executive Director, from the GLBT Historical Society, shares his perspective.

    Queer art has a long history. From a leather version of Michelangelo’s David to giant rainbow flags to outrageous drag outfits, queer people have been making art — and queering existing artworks — since time immemorial. The GLBT Historical Society in San Francisco preserves thousands of pieces that document a vast range of queer arts, from sculpture and painting to poetry, and dance.

    Many of these belong to our Art & Artifacts collection, which is something of a cabinet of curiosities, or, as we call them, “queeriosities.” With over 1,000 items, it is one of the world’s largest collections of two and three-dimensional objects that illustrate historical LGBTQIA+ material culture. Our archives hold these artworks along with documents and artifacts that tell the stories of countless LGBTQIA+ lives and communities.

    Our archives hold diaries by gay and trans historian Lou Sullivan; outfits from icons like José Sarria, Sylvester, and Gilbert Baker; a rich trove of writing and correspondence from Phyllis Lyon and Del Martin; the suit Harvey Milk was wearing the day of his murder; the only known remnant of the original rainbow flags; and so much more. We keep these objects safe and accessible so current and future generations can learn their history, and find their own place in it.

    Visit Google Arts & Culture’s new hub ‘A Queer Eye on Art History’ to explore our stories, along with those of more than 60 other cultural institutions, such as:

    Uncovering hidden histories

    • Did you know that famous art could be... queer? Discover new perspectives on iconic classical artworks with creators Rainbow History Class as they dive deep into the figures and homoerotic motifs found in Renaissance art.
    • Hear more about the history and symbols behind Pride, or about queering the art canon with YouTuber and author Rowan Ellis. Could retrospectively inserting queerness into the canon give LGBTQIA+ artists their place in the spotlight?

    Amplifying LGBTQIA+ lives and community

    Celebrating queer artists

    • From Rosa Bonheur, Keith Haring and Frida Kahlo, to Zanele Muholi and Kehinde Wiley, discover the artists who made an impact on queer art and history.
    • Learn more about the intimate works of Zanele Muholi and how they celebrate South Africa’s Black and queer community, or take a closer look at Kehinde Wiley’s iconic portraits.

    Ready to explore more?

    Visit g.co/prideculture or download Google Arts & Culture’s Android or iOS app. To learn more about the GLBT Historical Society, and support our work, visit glbthistory.org. Happy Pride to everyone!

  • Preserving languages and the stories behind them Thu, 30 Jun 2022 09:00:00 +0000

    To celebrate the first year of UNESCO’s International Decade of Indigenous Languages, seven more indigenous languages are now available on Woolaroo — a Google Arts & Culture experiment that uses machine learning to preserve and help people explore endangered languages.

    On average, a different language becomes extinct every 14 days. And of the 7,000 languages currently spoken around the world, more than 3,000 are under threat of disappearing — along with the rich cultures they represent.

    Thanks to a collaboration with our global partners, ranging from language communities to national language institutes, you can now discover the languages of Maya, Tepehua, Sanskrit, Vurës, Kumeyaay/Diegueño, Potawatomi and Serravallese, spoken across Mexico, South Asia, the South Pacific, the United States and Italy. Simply choose a language, take a picture of an object, and Woolaroo will return the translation for it thanks to the Google Cloud Vision API.

    Discover stories from endangered language speakers

    For the first time on Google Arts & Culture, you can find stories written by the speakers of these languages. In these accounts, they share the cultures they’re connected to and how they’re using technology to promote language learning and preservation.

    Our Potawatomi tribe partner, Justin Neely, is using Woolaroo to promote and preserve the Potawatomi’s language, Bodéwadmimwen, among students and young people. “Words, phrases and verb conjugations show how the Potawatomi see the world — with an emphasis on connection to the earth, a high regard for mother nature and living beings, and a communal lifestyle,” says Neely. Neely felt that Woolaroo would suit children in particular, allowing them to use technology as a way to explore their heritage.

    Explore more languages and communities on Google Arts & Culture, in the iOS or Android app, and at g.co/woolaroo.

  • 10 reasons to switch to Android Wed, 29 Jun 2022 18:00:00 +0000

    In the last year, over a billion new Android phones were activated. Ready to join the fun, but not sure which phone is best for you? Consider one that’s loaded with the best of Google, that can fold to fit in your pocket or fit your budget, or has a camera that can capture any shot. Regardless of which phone you choose, making the switch from iPhone to Android has never been easier.

    Starting today, support for the Switch to Android app on iOS is rolling out to all Android 12 phones, so you can move over some important information from your iPhone to your new Android seamlessly. Once you’ve got your new Android phone, follow our easy setup instructions to go through the data transfer process. You’ll be prompted to connect your old iPhone with your new Android phone either with your iPhone cable or wirelessly via the new Switch to Android app. The instructions will walk you through how to easily transfer your data like your contacts, calendars and photos over to your new phone.

    Once you’re all set up, you can get started on your new Android device by checking out our favorite features.

    1. Express yourself in new ways: With the Messages app and Gboard, it’s easy and enjoyable to send messages — especially between friends who use Android. Group chats, high-quality photo and video sharing, read receipts and emoji reactions are all available thanks to RCS, and thousands of emoji mashup stickers are there to help you express your feelings. (Rest assured, your iPhone friends will still receive your messages as well.)
    2. Video chat with anyone, anywhere: If your friends and family have Google accounts, it's easier than ever to video chat with Google Meet on Android. If you prefer FaceTime, you can still use it in the latest version of Chrome. And with apps like WhatsApp on Google Play, you can chat with whomever you like for free around the globe. Android has so many options that it’s easy to stay connected with those who matter most to you.
    3. Tune into your favorite music: Catch up on the latest hits with your preferred streaming service available on Android. And if you had previously purchased and downloaded music on your iPhone, your music will transfer over to your Android phone, as long as it’s digital rights management (DRM)-free. Your purchases and downloaded content from Apple Music will still be accessible on your new Android device by downloading the Apple Music app.
    4. Your favorite apps and more: With Google Play, you’ll find the apps you already use and love, and quickly start to discover so many more. Looking to plan an outdoorsy getaway? Hipcamp will help you book your next camping spot, Skyview Lite will be your stargazing guide to the sky, and AllTrails will help you find a hike that’s perfect for you and your friends. A summer of fun made possible with your new Android.
    5. A privacy-first approach: On your new phone, your data is proactively protected by Android. Android helps defeat bad apps, malware, phishing and spam, and helps keep you one step ahead of threats. Messages, for example, helps protect people against 1.5 billion spam messages per month. Android also provides timely recommendations, like prompting you to select your location-sharing preferences when opening an app to help you make the best decisions for your privacy. Read more about how to keep your data private and secure.
    6. More devices that work better together: Choose from a wide variety of Chromebooks, Wear OS smartwatches, Google TV devices and Fast Pair supported headphones, like Pixel Buds, that work better together with your phone. In fact, some of your Apple products will still work with your Android device, like AirPods.
    7. Get more done with Google apps and services: Traveling on vacation and can’t read the local signs? Scan the text for instant translation so you can get to your destination quickly. Editing a Google Doc on your laptop, but need to finish on the go? You can easily keep work going on your Android phone, too. Google prides itself on being helpful, and the best of Google is built into Android phones.
    8. Share music, photos and more across devices: Nearby Share lets you easily share music, photos and other files between your nearby Android and Chrome OS devices. To share content like photos and videos with non-Android devices, you can easily use sharing built into Google Photos or several other apps that allow you to share with friends and family and keep them in an organized memory bank for the future.
    9. Customize your Home screen with Android Widgets: Widgets are helpful additions to any Home screen, putting the information that’s most important to you right at your fingertips. There will soon be 35 Google widgets available on Android, so whether you want to have easy access to Google Maps’ real-time traffic predictions or have translations at the ready so you can communicate with family and friends, Android is there to make your life a little easier.
    10. Technology that’s useful for everyone: Everyone has their own way of using their devices. That’s why we build accessible features and products that work for the various ways people want to experience the world. Whether you want to use your device without ever needing the screen using TalkBack, or you want to take what’s being said out loud and create a real-time transcript with Live Transcribe, Android has you covered when and how you need it.

    And that’s not all. Between our major annual updates, we’re always adding new features to Android.

  • Hear from app and game founders in #WeArePlay USA Wed, 29 Jun 2022 17:00:00 +0000

    Last week, we launched #WeArePlay, a new series featuring the people behind your favorite Google Play apps and games. To celebrate the Fourth of July holiday, we’re putting a special spotlight on app founders and developers across every U.S. state. #WeArePlay USA introduces you to the passionate professionals behind more than 150 growing businesses.

    A gif of a collage of headshots that turns into the shape of a U.S. map. The gif ends with the text “#WeArePlay” and the URL g.co/play/weareplay-usa

    Let’s take a quick road trip across the #WeArePlay USA collection, starting in the Big Apple. New Yorker Tanya was so inspired when her eight-year-old daughter asked to open an investment account that she created Goalsetter — an app that helps kids learn about finance through fun activities. She wants to help kids, and their parents, build stronger financial futures: “Part of my mission is to close the wealth gap in America by educating the next generation.” Read more stories from New York.

    A graphic featuring a photo of Tanya with her kids, her name, her location of “New York, New York,” the name of her app “Goalsetter” and the #WeArePlay logo and URL.

    Our next stop is Raleigh, North Carolina to meet Joe, John and Grant. They created JouleBug to help people better understand their environmental impact through interactive challenges — like competing with friends to save the most energy or reduce the most waste. “As we go through our days, it’s become easy to waste resources and not even notice it,” says Grant. “We want to draw attention to this and show how simple it is to change your habits.” Discover more stories from North Carolina.

    A graphic featuring a photo of Joe, John and Grant on a mountain, their names, their location of “Raleigh, North Carolina” the name of their app “Joulebug” and the #WeArePlay logo and URL.

    Making our way west, we meet Clarence and Edna in Tulsa, Oklahoma. They both share a passion for education and worked together to create Boddle — a 3D game that motivates kids to learn math. Using AI, Boddle also helps parents and teachers tailor learning content and track performance. Check out more stories from Oklahoma.

    A graphic featuring a photo of Clarence and Edna, their names, their location of “Tulsa, Oklahoma” the name of their app “Boddle Learning” and the #WeArePlay logo and URL.

    Our final stop brings us to Santa Monica, California with Jenova. While he was in film school, Jenova asked himself — could a game make you cry the same way a movie can? He launched thatgamecompany and started building games that tug at players’ heartstrings. His company now has close to 100 employees. Read more stories from California.

    A graphic featuring a photo of Jenova, his name, his location of “Santa Monica, California” the name of his app “thatgamecompany” and the #WeArePlay logo and URL.

    Explore the rest of the #WeArePlay USA collection, and stay tuned for more stories from around the world.

  • 4 ways creators can bounce back from setbacks Wed, 29 Jun 2022 16:00:00 +0000

    Life is never completely smooth sailing, and challenges can strike even with the best-laid plans in place. We asked creators how they handle challenges and how these obstacles — while often difficult in the moment — can serve as opportunities to learn, grow and build resilience.

    Learn from your mistakes, and do better next time

    Take Monique Elise, an author, financial analyst and lifestyle influencer passionate about empowering women. After her first brand collaboration failed, Monique realized these campaigns require more work than they appear to on the surface, and that being an influencer means much more than taking pretty photos. “I underestimated just how much work, preparation and organization goes into creating content that I’m truly proud of,” she shares. While initially disappointed in her results, she quickly shifted her mindset and learned what to do differently in the future. “Truthfully, that experience was so necessary,” she says, “because it made me understand how important it is to be prepared, especially if I want to represent my business in a meaningful way.”

    Monique wears a pink suit and black shirt while sitting on a desk.

    Monique builds her confidence from a support group of family members, friends and peers.

    Be prepared for the unexpected

    As Monique shared, being prepared is key — and that includes being prepared for the unexpected. Rae Allen learned this as she was building her brand as a fitness and style creator. Rae’s goal was to run a mile every day, and just as she was getting started, she found out she needed a series of back-to-back surgeries. She quickly turned this setback into an opportunity to grow her platform in an authentic way.

    “At first, I felt like a failure because I couldn’t technically run,” Rae shares. “But I realized I set the rules.” After her surgeries, she started walking, jogging, then finally running again — regularly posting about her recovery, and her favorite workout outfits, on Instagram. “If it weren’t for this setback, I never would have found my true passion for creating,” she says. “My platform changed immensely, as did my fitness journey as an athlete. Today people look to me for inspiration, motivation and empowerment.”

    Rae Allen runs down the street next to her father, who rides a bicycle next to her.

    Rae Allen recently celebrated 2,700 days of running a mile every day.

    Lean on your support group

    When a challenge inevitably arises, it can feel like you need to solve it on your own. But that doesn't have to be the case. Monique depends on her support group of “friends, fellow creators and my boyfriend.” She shares, “Having a support system really helps when you’re suffering from self-doubt. Also, don’t be afraid to ask for help. You’d be surprised at how many people want to help you and see you succeed.” Having a peer group to share your experiences with can be especially comforting, because they can empathize with the nature of your work and offer actionable suggestions.

    Channel resilience

    No matter the endeavor, it’s important to keep moving forward and reaching for your goals. Monique and Rae both found strength in the face of disappointment, and the determination to press onward. It’s something that still inspires Rae today, and she wants to share that spirit of resilience with every creator: “Keep going! Whatever it is you’re facing — just keep going with one foot in front of the other. There will be highs and lows and it will be hard. No matter what we do in life, we will always face obstacles. So why not face obstacles doing something you love? The journey is worth it.”

  • Mentorship and support for Black and women founders Wed, 29 Jun 2022 16:00:00 +0000

    Women-led startups received just 2.3% of venture capital funding in 2020. The venture capital industry remains male-dominated, both among decision-makers and the entrepreneurs who are successful in their pitches for investment. For Black founders, the gap is even wider, with only 1.2% of VC funding in the U.S. going to Black-led startups in 2021.

    Mentorship and access to resources are critical to closing the startup funding gap. To connect underrepresented founders to the right people and practices to help them grow, today we’re opening up applications for the Google for Startups Accelerator: Black Founders and Google for Startups Accelerator: Women Founders in North America. Applications are open now through July 28, 2022.

    Google for Startups Accelerators are ten-week programs of intensive workshops and expert mentorship for revenue-generating tech startups. Founders receive virtual mentoring and technical support from Google engineers and external experts tailored to their business, without giving up equity in return.

    To learn more about the impact of Google for Startups Accelerator mentorship on participating founders, we sat down with alumnae Ingrid Polini, cofounder and CEO of document management startup SAFETYDOCS Global, and Tiffany Whitlow, cofounder and Chief Development Officer of Acclinate, a digital health startup helping pharmaceutical companies diversify clinical trials by accessing and engaging communities of color. Ingrid was part of the 2021 Accelerator: Women Founders class, and Tiffany and her cofounder Del Smith were selected for both Accelerator: Black Founders and the Google for Startups Black Founders Fund in 2021.

    What is one piece of advice you would share with founders who are considering applying for a Google for Startups Accelerator?

    Ingrid: Be as open as you can about your business, so the team can really help you. Be present, participate and ask questions, because in the end, you’re applying your scarce time to it as a founder.

    Tiffany: Go for it. The resources and ecosystem are invaluable.

    Visit Google for Startups Accelerator: Black Founders and Google for Startups Accelerator: Women Founders to learn more about the programs, including details on how to apply.

  • Reducing gender-based harms in AI with Sunipa Dev Wed, 29 Jun 2022 16:00:00 +0000

    Natural language processing (NLP) is a form of artificial intelligence that teaches computer programs how to take in, interpret and produce language from large data sets. For example, grammar checkers use NLP to suggest corrections that help people write grammatically correct sentences. But as Google’s AI Principles note, it’s sometimes necessary to have human intervention to identify risks of unfair bias.

    Sunipa Dev is a research scientist at Google who focuses on Responsible AI. Some of her work looks specifically at ways to evaluate unfair bias in NLP outcomes, reducing harms for people with queer and non-binary identities. Sunipa’s work was recently featured at a workshop at the ACM Conference on Fairness, Accountability, and Transparency (FAccT) in Seoul, Korea.

    In our interview, she emphasizes that her work is achievable only through forging collaborative partnerships between researchers, engineers, and AI practitioners with everyday users and communities.

    What inspired you to take on this career path?

    While working on my PhD at the University of Utah, I explored research questions such as, “How do we evaluate NLP technologies if they contain biases?” As language models evolved, our questions about potential harms did, too. During my postdoc work at UCLA, we ran a study to evaluate challenges in various language models by surveying respondents who identified as non-binary and had some experience with AI. With a focus on gender bias, our respondents helped us understand that experiences with language technologies cannot be understood in isolation. Rather, we must consider how these technologies intersect with systemic discrimination, erasure, and marginalization. For example, the harm of misgendering by a language technology can be compounded for trans, non-binary, and gender-diverse individuals who are already fighting against society to defend their identities. And when it’s in your personal space, like on your devices while emailing or texting, these small jabs can build up to larger psychological damage.

    What is your current role at Google?

    I am currently a Research Scientist at the Responsible AI - Human Centered Technology team. In my current role, I am working to build a better understanding of how to avoid unfair bias in AI language models across different cultures and geographies, aligned with Google’s AI Principles.

    This is a challenge because language changes, and so do cultures and regional laws as we move from one place to another. This can all impact how people express themselves, what identities they choose and how they experience discrimination on a daily basis. Gender bias can manifest in entirely different ways in different parts of the world. In some of my ongoing work that focuses on a non-Western point of view, we are working with social scientists and NGOs in India while engaging with local communities. We are using the voices of many people who are living in a specific region and asking, “What are the biases prevalent in their society?”

    What is gender bias in NLP?

    Written text and training data for language technologies can lack representation or misrepresent different gender identities; this can reflect social biases. As a result, some NLP technologies can reinforce gender stereotypes and slurs, erase people’s gender identities, or have reduced quality of service for marginalized communities. What drives me in my work is my goal to make language technologies more inclusive and usable.
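
    One common way researchers quantify this kind of bias, though not necessarily the exact method used in this work, is to measure how strongly word embeddings associate neutral words (such as occupations) with gendered words. A toy sketch with invented 2-D vectors:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hand-made toy embeddings, purely for illustration; real models use
# hundreds of dimensions learned from text corpora.
emb = {
    "he": (1.0, 0.1), "she": (0.1, 1.0),
    "engineer": (0.9, 0.2), "nurse": (0.2, 0.9),
}

def gender_association(word):
    """Positive means closer to 'he', negative means closer to 'she'."""
    return cosine(emb[word], emb["he"]) - cosine(emb[word], emb["she"])
```

    In this toy space "engineer" skews male and "nurse" skews female; at scale, the same kind of measurement over real embeddings is one signal that training data has encoded a stereotype.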

    Why does this matter for AI?

    Gender can be such an integral part of someone's identity, and having that wrongly assumed by an AI system can be triggering, unfair, and harmful. We need to work towards systems and societies that do not encode unfair biases and harmful stereotypes in order to break out of the cycle of perpetuating harms of stereotyping, misgendering, and erasure.

    How can people who are not researchers, engineers or AI practitioners engage in this work?

    A very direct way is for people to report potential harms as bugs within products they use. People can also participate in open discussions in workshops, panels and town halls. These are all helpful ways to build inclusive AI.

    I want to emphasize, however, that the onus can’t only be on the user. It’s also on the side of the researcher, engineer and AI practitioner. The goal is to create a continuous feedback loop between humans and machines, with real people stepping in to ensure the creation of more responsible AI. As AI practitioners, we need to work with the people we’re trying to serve and have users collaborate with us to tell us what we need to do better.

  • Kickstart your monetization with the AdSense onboarding video series Wed, 29 Jun 2022 15:00:00 +0000

    We’re introducing the Google AdSense Onboarding video series to help publishers who are new to the program. This five-part video series will cover a range of topics from how to use the AdSense dashboard to creating ad units. It’s designed to bring new publishers like you one step closer to turning passion into a successful business model.

    The short, informative videos will help you to improve your earning potential by learning how to optimize your ads and sites. All episodes are fully available to watch on our AdSense YouTube channel.

    Here’s what the series has in store:

    You will follow Finn, a publisher new to AdSense like you, as he learns how to manage his account and how to optimize his ads and sites. Through the episodes you’ll build on your practical knowledge of the AdSense platform.

    The videos will show you how to set up the right ads format and how to create a reporting structure that works for you and your business. You will also find recommendations on how to grow your business and ways to optimize your ads by exploring new opportunities and conducting experiments on your AdSense dashboard.

    Onboarding series playlist on YouTube

    Episode 1: Your AdSense account setup

    In this video, you’ll get to know your AdSense dashboard and navigate through the account. You’ll also learn more about the Ads, Reports, Payments, and Policy Center sections of the dashboard.

    Episode 2: Managing your ads and sites

    In this video, you’ll learn about the two different ad types, Auto ads and manually-placed ad units. You’ll discover how to customize the ad types to suit your site, how to control where the ads are placed and the different ad formats. You’ll also learn about the optimal number of ads to show on your pages and how to set up page exclusions to ensure ads only appear where you want.

    Episode 3: Use AdSense blocking controls to review your ads

    In this video, you’ll discover how to protect your brand and prevent certain ads from appearing on your site. The Ad review center offers several options to review and manage ads and ad categories in an easy and efficient way.

    Episode 4: Understanding your performance, traffic and revenue

    The only way to measure a site’s growth over time is to track its performance. In this video, you’ll learn how to use pre-made and custom reports to measure account performance. You’ll discover the important metrics to monitor and which reports to use.

    Episode 5: Optimize your ads and boost your revenue

    The final video in the series recaps the key optimization tips to improve your earning potential and help set you up for success! The video focuses on four key tips: using the Ad review center to manage ads, setting up Auto ads to easily find new opportunities, conducting experiments on AdSense, and seeing if your account is eligible for AdSense Labs to test new features.

  • Go on an epic adventure with Netflix’s “The Sea Beast” Wed, 29 Jun 2022 14:00:00 +0000

    Craving a different type of drive this summer? Go on a high-seas adventure without stepping off land. Activate Waze’s latest driving experience, inspired by Netflix’s newest movie, “The Sea Beast.” (Check out the trailer and the film on Netflix July 8.)

    Starting today, you’ll meet the dynamic duo of Maisie, a precocious stowaway, and Blue, a little beast with a huge mischief streak, and revel in the unlikely comedy of their friendship as they help you navigate every turn you take on Waze. And don’t worry: Maisie will help translate Blue’s sounds for you. You’ll also get to know some other Beasts that they find on their journey when you choose between three new Moods: Blue, Red and Yellow. Don’t forget to swap your vehicle for a Lifeboat, to get into the true adventurer’s spirit.

    With Sea Beast Mode activated, get ready to explore the world together, on a journey full of surprise, wonder and funny banter — because where the map ends, the adventure begins.

    If you’re interested in seeing the magic in real life, Netflix is hosting a series of experiences across the U.S. at aquariums, museums and more to celebrate the launch of The Sea Beast.

    For a drive that takes you to the seas, visit Waze or click “My Waze” in your Waze app and tap the “Turn on Sea Beast Mode” banner to activate. It’s available globally, in English, for a limited time.

  • It’s electric! 6 lessons from our largest electric kitchen Tue, 28 Jun 2022 16:00:00 +0000

    We recently opened our all-electric Bay View campus, which also marked the debut of our largest electric kitchen. As our biggest blueprint for fully carbon-free cafes and kitchens yet, Bay View will help advance our commitment to operate on 24/7 carbon-free energy across all of our campuses by 2030.

    Still, any big change comes with a learning curve. So whether you’re a professional chef or an at-home cook, here are six lessons we’ve learned to help you make the switch to electric:

    Electric is way faster. The benefits of electric kitchens go beyond climate impact, starting with speed. The first time I ever cooked on induction (electric) equipment, the biggest surprise was just how incredibly fast it is. In fact, induction boils water twice as fast as traditional gas equipment and is far more efficient — because unlike a flame, electric heat has nowhere to escape. At Bay View, our training programs help Google chefs appreciate and adjust to the new pace of induction. The speed truly opens up whole new ways of cooking.

    It’s also safer, simpler and cooler. Compared to traditional gas equipment, induction equipment is safer because there’s very little heat transfer after you remove a vessel, reducing burn risk. Cleanup is a simple wipe-down versus removing stainless steel grates that stay at hundreds of degrees for hours. We also think — and we’re collecting data at Bay View to confirm — that electric kitchens will be more comfortable to work in, because they're potentially cooler (you don't need to leave the heat on high) and quieter (you don’t need the hood fans as often).

    The end result is delicious. You can cook world-class food with induction equipment, and many Michelin 3-star restaurants already do. At Bay View, we did a full recipe review to match our same great flavor profiles using induction equipment. Turns out if you have the right brines, marinades, seasoning and technique, you can easily adapt recipes to electric equipment without compromising taste. For example, you can achieve the smoky taste of grilled asparagus on induction simply by giving it time in a smoker or adding smoked salt.

    A plate of vegetables is arranged to showcase grill marks.

    Great equipment is available and affordable. If you haven’t seen induction cooking equipment in a while, you might still imagine a flimsy old hot plate. Not even close. You can now match every piece of gas equipment with a well-designed (and increasingly affordable) electric equivalent, including skillets, chargrills and woks. At Bay View, we even have electric pizza ovens! At home, a lot of your old equipment will carry over: Common cast iron, stainless steel and non-stick pans are all induction friendly if they have a magnetic base. Just try sticking on a magnet!

    You still get that sizzle. Many chefs — both professional and at home — equate a gas flame with “real” cooking. But you don’t need fire to do a great job in the kitchen. For instance, you can sear proteins to enhance flavor on induction the same way you would on gas, and you still get that nice sizzle when you drop food in the pan. At Bay View (and across our cafes), we’ve leaned heavily into electric kitchen training sessions to help inspire our cooks, share best practices and give them the tools to do their best work.

    Four chefs gather around a table with two pizza pies in the center. One chef gestures towards the pies while the others review recipes.

    There are creative ways to manage the electrical load. Cooking aside, all-electric kitchens do tend to increase a building’s electrical load — but technology and planning can help manage that need. For example, as we’ve found across many of our electric cafes, smart circuit controls can automatically power down certain kitchen equipment if electricity loads get too high. At Bay View, we added spare circuits into our electrical panels for more flexibility, making it easier to accommodate changes in kitchen equipment use over time.

    These lessons apply far beyond Bay View, to kitchens big and small, and I hope they inspire others to make the move to electric. What's been most rewarding to find at Bay View, and at our other electric kitchens, is that you don't have to choose between creating something delicious and protecting the planet — you can do both.

  • Reuniting the historic Stonewall Inn Tue, 28 Jun 2022 16:00:00 +0000

    Photo of Stonewall Inn facade taken by CyArk during a documentation project in March, 2017. Learn more about the Stonewall Inn with CyArk on Google Arts & Culture

    The Stonewall Inn is known around the world as the site of the Stonewall Riots, which ignited the modern LGBTQ+ rights movement in 1969 in New York City. But at the time of the rebellion, the Stonewall Inn actually consisted of what is now two locations: 53 Christopher Street, the current location of the Stonewall Inn bar, and 51 Christopher Street next door. Over the years, as rents rose, the two sites were separated, and there was little evidence left that 51 Christopher Street played such a vital role in the heritage of the LGBTQ+ rights movement.

    On Friday, this all changed. LGBTQ+ activists, with Google’s support, joined local elected officials to break ground on the new Stonewall National Monument Visitor Center, reuniting the two sites.

    Four people hold shovels in front of a Pride flag and Google and YouTube logos.

    Ann Marie Gothard, Governor Kathy Hochul, Senator Chuck Schumer and Google and Alphabet SVP and CFO Ruth Porat at Friday's groundbreaking.

    Scheduled to officially open in the summer of 2024, the Stonewall National Monument Visitor Center’s mission is to preserve, advance and celebrate the legacy of the Stonewall Rebellion. In 2016, then-President Barack Obama designated the 0.19-acre area, formerly known as Christopher Park, and the surrounding Christopher Street as the Stonewall National Monument, making it the first U.S. national monument dedicated to the LGBTQ+ community and their fight for equal rights.

    Through a grant of $1 million from Google.org to help acquire the lease, Google is helping make this dream a reality. Visitors to the center will discover an immersive experience that takes them on a tour of LGBTQ+ history and culture. The center will host in-person and virtual tours, lectures, exhibitions and visual arts. It will also be the home base for the National Park Service Rangers who maintain the Stonewall National Monument.

    Interior photograph of Stonewall Inn bar without people inside. A wooden bar and stools are visible and alcohol is lined up behind the bar, along with T-shirts for sale.

    Photo of Stonewall Inn interior taken by CyArk during a documentation project in March, 2017. Learn more about the Stonewall Inn with CyArk on Google Arts & Culture.

    Google has been deeply invested in preserving and sharing the history of the Stonewall Riots for many years. In 2019, on the 50th anniversary of the Stonewall Rebellion, we provided support for Stonewall Forever, an interactive “living monument” sharing 50 years of LGBTQ+ history. With a $1.5 million grant from Google.org and volunteers from Google’s Creative Lab, the LGBT Community Center of New York City (The Center) launched the living monument, which connects diverse voices from the Stonewall era to the stories of millions of LGBTQ+ people today. The living monument contains countless colorful pieces that people can click on to view digitized historical artifacts, oral histories and interviews from today. In the years since, participation in Stonewall Forever has grown as thousands of people have added their history by uploading photos, messages and stories.

    Illustration of the New York City skyline with a rainbow of small squares bursting out of an area of the city where the Stonewall Inn sits

    Launched in 2019 by the LGBT Community Center of New York City, Stonewall Forever is an interactive living monument sharing 50 years of LGBTQ+ history.

    Supporting the LGBTQ+ community has been a longstanding commitment from Google. By supporting the reunification of the Stonewall Inn and the development of the Stonewall National Monument Visitor Center, we’re proud to do our part to preserve and commemorate the achievements of the past and to take big steps toward a brighter, more equitable future.

  • Expanding access to clean energy careers Tue, 28 Jun 2022 15:00:00 +0000

    Climate change affects everyone, but not equally. Our fossil-based energy system has disproportionately impacted communities of color and low-income communities for generations. So as the world transitions to a carbon-free electric grid, it’s important to support programs building a just and equitable clean energy economy.

    This transition to clean energy is expected to create 10.3 million jobs by 2030, outpacing the nearly 2.7 million fossil fuel jobs of today. Google.org and Google Nest recently partnered with Dream Corps Green For All to launch The Green For All Clean Energy Scholarship Fund, which aims to expand access to clean energy careers for jobseekers from underrepresented communities.

    We recently announced our first recipients at Black Future Weekend, a Dream Corps event focused on diversifying the tech industry. As part of the application process, they shared their “green dream” and explained why they wanted a career in the renewable energy industry:

    Quianya Enge (Carbondale, Illinois)

    As someone directly impacted by the criminal justice system and now a doctoral student in Higher Education and Administration with a master’s degree in Workforce Education and Development, my dream is to build a career in the clean energy sector. Renewable energy jobs are perfect for those who need a second chance in the workforce. However, there is a negative perception of felons within the solar industry and society as a whole — and as a workforce developer, I’d like to change that. I want to build a team that helps individuals from marginalized communities find training and jobs in the solar field, and form partnerships with groups in the clean energy industry that work to reduce recidivism.

    Alcia Shaw (Brooklyn, New York)

    I grew up on a farm in Jamaica, deep within the island’s green-swathed mountains. Despite the hardships I faced growing up in a poverty-stricken country, as a young girl, I found tranquility in climbing the nearest tree and watching as the deep blue Caribbean Sea embraced the north coast. It will be a dream come true for me when our communities are no longer at risk of excess pollution, waste and questionable water sources. This scholarship will allow me to enroll in a sustainability management course at Yale University, giving me the qualifications I need to pursue my passion for equality and maintain the environmental integrity of my community and similar areas across the globe.

    Kristian Thymianos (Las Vegas, Nevada)

    To me, the clean energy sector is a way to keep my community alive despite the ongoing issues surrounding climate change. I grew up in Las Vegas, Nevada — where the only thing as intense as the city is the sun beating down on it. We contend with major issues due to climate change, like urban heat islands that threaten the health of our residents and tourists and negatively impact our infrastructure, and declining water resources. These are not unique to Las Vegas, but they impact our community more than others. ​​Finding ways to fix and provide for my hometown pushes me to do the work I do.

    Extending the impact with Nest Renew

    Last year, Google Nest unveiled Nest Renew, a service for compatible Nest thermostats in the U.S. that makes it easy to support clean energy right from home. Through the Energy Impact Program, a feature within Nest Renew, you can help direct funds to nonprofit partners working towards an equitable sustainable future.

    Later this year, Dream Corps will join GRID Alternatives and Elevate Energy as a founding partner of the Energy Impact Program, ensuring continued support for scholarship recipients and guaranteed career placement for individuals from underrepresented communities.

  • Waze helps Tour de France shift up a gear Tue, 28 Jun 2022 13:00:00 +0000

    Waze is activating a first-of-its-kind partnership and sponsorship with the Tour de France and the Tour de France Femmes avec Zwift to improve your experience at every stage of the race, in and outside the car.

    For the first time in the race’s 109-year history, the Tour de France and the Tour de France Femmes avec Zwift will join Waze’s Global Event Partner Program, harnessing the power of the app to make it easier for both locals and spectators to experience the event, from start to finish. As the Official Traffic Manager of the race, we’ll provide tools, data and insights, not only helping drivers but also athletes, fans and more.

    Plan your journey around more than 4,000 kilometers of road closures, locate temporary parking lots and monitor live traffic speeds as you navigate the action using Waze. From Copenhagen to the Champs-Élysées, Waze’s volunteer map editors will equip you with the real-time insights you need to outsmart traffic along the way.

Google Ads
Many books were created to help people understand how Google works, its corporate culture and how to use its services and products. The following books are available:

  • Ultimate Guide to Google Ads
  • The Ridiculously Simple Guide to Google Docs: A Practical Guide to Cloud-Based Word Processing
  • Mastering Google Adwords: Step-by-Step Instructions for Advertising Your Business (Including Google Analytics)
  • Google Classroom: Definitive Guide for Teachers to Learn Everything About Google Classroom and Its Teaching Apps. Tips and Tricks to Improve Lessons’ Quality.
  • 3 Months to No.1: The "No-Nonsense" SEO Playbook for Getting Your Website Found on Google
  • Google AdSense Made Easy: Monetize Your Website and Blogs Instantly With These Proven Google Adsense Techniques
  • Ultimate Guide to Google AdWords: How to Access 100 Million People in 10 Minutes (Ultimate Series)

Google Cloud Blog

  • Built with BigQuery: How Exabeam delivers a petabyte-scale cybersecurity solution Thu, 30 Jun 2022 16:00:00 -0000

    Editor’s note: The post is part of a series highlighting our awesome partners, and their solutions, that are Built with BigQuery.

    Exabeam, a leader in SIEM (security information and event management) and XDR (extended detection and response), provides security operations teams with end-to-end Threat Detection, Investigation, and Response (TDIR) by leveraging a combination of user and entity behavioral analytics (UEBA) and security orchestration, automation, and response (SOAR) to allow organizations to quickly resolve cybersecurity threats. As the company looked to take its cybersecurity solution to the next level, Exabeam partnered with Google Cloud to unlock its ability to scale for storage, ingestion, and analysis of security data.

    Harnessing the power of Google Cloud products including BigQuery, Dataflow, Looker, Spanner and Bigtable, the company is now able to ingest data from more than 500 security vendors, convert unstructured data into security events, and create a common platform to store them in a cost-effective way. The scale and power of Google Cloud enables Exabeam customers to search multi-year data and detect threats in seconds.

    Google Cloud provides Exabeam with three critical benefits:

    • Global scale security platform. Exabeam leveraged serverless Google Cloud data products to speed up platform development. The Exabeam platform supports horizontal scale with built-in resiliency (backed by 99.99% reliability) and data backups in three other zones per region. Also, multi-tenancy with tenant data separation, data masking, and encryption in transit and at rest are backed up in the data cloud products Exabeam uses from Google Cloud.

    • Scale data ingestion and processing. By leveraging Google’s compute capabilities, Exabeam can differentiate itself from other security vendors that are still struggling to process large volumes of data. With Google Cloud, Exabeam can provide a path to scale data processing pipelines. This allows Exabeam to offer robust processing to model threat scenarios with data from more than 500 security and IT vendors in near-real time. 

    • Search and detection in seconds. Traditionally, security solutions break down data into silos to offer efficient and cost-effective search. Thanks to the speed and capacity of BigQuery, Security Operations teams can search across different tiers of data in near real time. The ability to search data more than a year old in seconds, for example, can help security teams hunt for threats simultaneously across recent and historical data. 

    Exabeam joins more than 700 tech companies powering their products and businesses using data cloud products from Google, such as BigQuery, Looker, Spanner, and Vertex AI. Google Cloud announced the Built with BigQuery initiative at the Google Data Cloud Summit in April, which helps Independent Software Vendors like Exabeam build applications using data and machine learning products. By providing dedicated access to technology, expertise, and go-to-market programs, this initiative can help tech companies accelerate, optimize, and amplify their success.

    Google’s data cloud provides a complete platform for building data-driven applications like those from Exabeam — from simplified data ingestion, processing, and storage to powerful analytics, AI, ML, and data sharing capabilities — all integrated with the open, secure, and sustainable Google Cloud platform. With a diverse partner ecosystem and support for multi-cloud, open-source tools, and APIs, Google Cloud can help provide technology companies the portability and the extensibility they need to avoid data lock-in.   

    To learn more about Exabeam on Google Cloud, visit www.exabeam.com. Click here to learn more about Google Cloud’s Built with BigQuery initiative. 

    We thank the many Google Cloud team members who contributed to this ongoing security collaboration and review, including Tom Cannon and Ashish Verma in Partner Engineering.

    Related Article

    CISO Perspectives: June 2022

    Google Cloud CISO Phil Venables shares his thoughts on the RSA Conference and the latest security updates from the Google Cybersecurity A...

    Read Article
  • Cloud Monitoring metrics, now in Managed Service for Prometheus Thu, 30 Jun 2022 16:00:00 -0000

    According to a recent CNCF survey, 86% of the cloud native community reports that they use Prometheus for observability. As Prometheus becomes more of a standard, an increasing number of developers are becoming fluent in PromQL, Prometheus’ built-in query language. While it is a powerful, flexible, and expressive query language, PromQL is typically only able to query Prometheus time series data. Other sources of telemetry, such as metrics offered by your Cloud provider or metrics generated from logs, remain isolated in separate products and might require developers to learn new query tools in order to access them.

    Introducing PromQL for Google Cloud Monitoring metrics

    Prometheus metrics alone aren’t enough to get a single pane of glass view of your Cloud footprint. Cloud Monitoring provides over 1,000 free metrics that let you monitor and alert on your usage of Google Cloud services, including metrics for Compute Engine, Kubernetes Engine, Load Balancing, BigQuery, Cloud Storage, Pub/Sub, and more. We’re excited to announce that you can now query all Cloud Monitoring metrics using PromQL and Managed Service for Prometheus, including Google Cloud system metrics, Kubernetes metrics, log-based metrics, and custom metrics.

    Google Cloud metrics appear within Grafana and can be queried using PromQL.

    Because we built Managed Service for Prometheus on top of the same planet-scale time series database as Cloud Monitoring, all your metrics are stored together and are queryable together. Metrics in Cloud Monitoring are automatically generated when you use Google Cloud services at no additional cost to you. You can view all your metrics in one place with a query language that developers already know and prefer, which opens up new possibilities.

    Exposing these metrics using PromQL means that developers who are familiar with Prometheus can start using all time series telemetry data without first having to learn a new query language. New members of your operations team can ramp up faster, as many industry hires will already be familiar with PromQL from previous experience.

    Why Managed Service for Prometheus

    In addition to PromQL for all metrics, Managed Service for Prometheus offers open-source monitoring combined with the scale and reliability of Google services.

    How to get started

    You can query Cloud Monitoring metrics with PromQL by using the interactive query page in Cloud Console or Grafana. To learn how to write PromQL for Google Cloud metrics, see Mapping Cloud Monitoring metric names to PromQL. To configure a Grafana data source that can read all your metrics in Cloud Monitoring, see Configure a query user interface in the Managed Service for Prometheus documentation.
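    As a rough illustration, the documented name mapping follows a simple pattern: dots and slashes in a Cloud Monitoring metric type become underscores, with a colon separating the metric domain from its path. The sketch below is illustrative only; verify edge cases (such as user-defined or log-based metric names) against the official mapping reference.

```python
def to_promql_name(metric_type: str) -> str:
    """Sketch of the Cloud Monitoring -> PromQL metric-name mapping.

    Example: "compute.googleapis.com/instance/cpu/utilization"
          -> "compute_googleapis_com:instance_cpu_utilization"
    """
    # Split the metric type into its service domain and metric path.
    domain, _, path = metric_type.partition("/")
    # Dots in the domain and slashes/dots in the path become underscores;
    # a colon joins the two halves into a valid PromQL metric name.
    return domain.replace(".", "_") + ":" + path.replace("/", "_").replace(".", "_")
```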

    To query Prometheus data alongside Cloud Monitoring, you have to first get Prometheus data into the system. For instructions on configuring Managed Service for Prometheus ingestion, see Get started with managed collection.

    Related Article

    Google Cloud Managed Service for Prometheus is now generally available

    Announcing the GA of Google Cloud Managed Service for Prometheus for the collection, storage, and querying of Kubernetes metrics.

    Read Article
  • Announcing Apigee Advanced API Security for Google Cloud Thu, 30 Jun 2022 16:00:00 -0000

    Organizations in every region and industry are developing APIs to enable easier and more standardized delivery of services and data for digital experiences. This increasing shift to digital experiences has grown API usage and traffic volumes. However, as malicious API attacks also have grown, API security has become an important battleground over business risk. 

    To help customers more easily address their growing API security needs, Google Cloud is announcing today the Preview of Advanced API Security, a comprehensive set of API security capabilities built on Apigee, our API management platform. Advanced API Security enables organizations to more easily detect security threats. Here’s a closer look at the two key capabilities included in this launch: identifying API misconfigurations and detecting bots.

    Identify API misconfigurations

    Misconfigured APIs are one of the leading reasons for API security incidents. In 2017, Gartner® predicted that by 2022 API abuses would be the most frequent attack vector resulting in data breaches for enterprise web applications. Today, our customers tell us application API security is one of their top concerns, which is supported by an independent study from 2021 by Fugue and Sonatype. The report found that misconfigurations are the number one cause of data breaches, and that “too many cloud APIs and interfaces to adequately govern” are frequently the main point of attack in cyberattacks.

    While identifying and resolving API misconfigurations is a top priority for many organizations, the configuration management process can be time consuming and require considerable resources.

    Advanced API Security can make it easier for API teams to identify API proxies that do not conform to security standards. To help identify APIs that are misconfigured or experiencing abuse, Advanced API Security regularly assesses managed APIs and provides API teams with a recommended action when configuration issues are detected.

    Advanced API Security identifies misconfigured API proxies, including the missing CORS policy.

    APIs form an integral part of the digital connective tissue that make modern medicine run smoothly for patients and healthcare staff. One common healthcare API use case occurs when a healthcare organization inputs a patient's medical coverage information into a system that works with insurance companies. Almost instantly, that system determines the patient's coverage for a specific medication or procedure, a process which is enabled by APIs. Because of the often-sensitive personal healthcare data being transmitted, it is important that the required authentication and authorization policies are implemented so that only authorized users, such as an insurance company, can access the API. 

    Advanced API Security can detect if those required policies have not been applied, an alert which can help reduce the surface area of API security risks. By leveraging Advanced API Security, API teams at healthcare organizations can more easily detect misconfiguration issues and can reduce security risks to sensitive information. 

    Detect Bots

    Because of the increasing volume of API traffic, there is also an increase in cybercrime in the form of API bot attacks: automated software programs deployed over the Internet for malicious purposes such as identity theft.

    Advanced API Security uses pre-configured rules to help provide API teams an easier way to identify malicious bots within API traffic. Each rule represents a different type of unusual traffic from a single IP address. If an API traffic pattern meets any of the rules, Advanced API Security reports it as a bot.

    Additionally, Advanced API Security can speed up the process of identifying data breaches by flagging bots whose requests received an HTTP 200 OK success response.

    Advanced API Security helps visualize Bot traffic per API proxy.

    Financial services APIs are frequently the target of malicious bot attacks due to the high-value data that is processed. A bank that has adopted open banking standards by making APIs accessible to customers and partners can use Advanced API Security to make it easier to analyze traffic patterns and identify the sources of malicious traffic. You may experience this when your bank allows you to access your data with a third-party application. While a malicious hacker could try to use a bot to access this information, Advanced API Security can help the bank’s API team to identify and stop malicious bot activity in API traffic.
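    The rule-based flagging described above can be sketched roughly as follows. This is an illustrative sketch only, not Apigee's actual implementation: the rule names, thresholds, and traffic fields are hypothetical, but the shape matches the description — each rule checks one kind of unusual traffic from a single IP address, any match reports a bot, and a bot that received a 200 OK is surfaced as a possible breach.

```python
from dataclasses import dataclass

@dataclass
class TrafficPattern:
    """Hypothetical summary of traffic from a single IP address."""
    ip: str
    requests_per_minute: int
    error_rate: float   # fraction of 4xx/5xx responses
    succeeded: bool     # at least one HTTP 200 OK response

# Illustrative rules; each one describes a distinct type of unusual traffic.
RULES = {
    "request_flood": lambda p: p.requests_per_minute > 1000,
    "credential_guessing": lambda p: p.error_rate > 0.5,
}

def detect_bot(pattern: TrafficPattern) -> dict:
    # A pattern matching any rule is reported as a bot.
    matched = [name for name, rule in RULES.items() if rule(pattern)]
    return {
        "ip": pattern.ip,
        "is_bot": bool(matched),
        "rules": matched,
        # A bot that received a 200 OK may indicate a successful breach.
        "possible_breach": bool(matched) and pattern.succeeded,
    }
```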

    API Security at Equinix

    Equinix powers the world’s digital leaders, bringing together and interconnecting infrastructure to fast-track digital advantage. Operating a global network of more than 240 data centers with a 99.999% or greater uptime, Equinix simplifies global interconnections for organizations, saving customers time and effort with the Apigee API management platform.  

    “A key enabler of our success is Google’s Apigee, delivering digital infrastructure services securely and quickly to our customers and partners,” said Yun Freund, senior vice president of Platform at Equinix. “Security is a key pillar to our API-first strategy and Apigee has been instrumental in enabling our customers to securely bridge the connections they need for their businesses to easily identify potential security risks and mitigate threats in a timely fashion. As our API traffic has grown, so has the amount of time and effort required to secure our APIs. Having a bundled solution in one managed platform gives us a differentiated high-performing solution.”

    Getting started

    To learn more, check out the documentation or contact us to request access to get started with Advanced API Security.

    To learn more about API security best practices, please register to attend our Cloud OnAir webcast on Thursday, July 28th, 2:00 pm PT.

    Gartner, API Security: What You Need to Do to Protect Your APIs, Mark O'Neill, Dionisio Zumerle, Jeremy D'Hoinne, 28 August 2019

    GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

    Related Article

    CISO Perspectives: June 2022

    Google Cloud CISO Phil Venables shares his thoughts on the RSA Conference and the latest security updates from the Google Cybersecurity A...

    Read Article
  • Bonjour Paris: New Google Cloud region in France is now open Thu, 30 Jun 2022 09:00:00 -0000

    At Google Cloud, we recognize that to be truly global, we must be local too. This means we need to be as close as possible to our customers, their locations, their regulations, and their values. Today, we’re excited to announce another step towards this goal: Our new Google Cloud region in Paris, France is officially open. 

    Designed to help break down the barriers to cloud adoption in France, the new France region (europe-west9) puts uniquely scalable, sustainable, secure, and innovative technology within arm’s reach, so that French organizations can embrace and drive digital transformation. A recent report indicates that Google Cloud’s impact on the productivity of French firms will support €2.4B - €2.6B in GDP growth and 13,000 - 14,000 jobs by 2027. Separately, the report details the impact of Google's infrastructure investments in France, which will support €490M in GDP growth and 4,600 jobs by 2027.

    Focusing on France

    Google Cloud’s global network is the cornerstone of our cloud infrastructure, helping you serve your customers better with high-performance, low-latency, and sustainable services. With the new France region, we now offer 34 regions and 103 zones, available in more than 200 countries and territories across the globe. 

    The region launches with three cloud zones and our standard set of services including Compute Engine, Google Kubernetes Engine, Cloud Storage, Persistent Disk, Cloud SQL, and Cloud Identity. In addition, it offers core controls to enable organizations to meet their unique compliance, privacy, and digital sovereignty needs.

    For the first time ever, both public and private organizations within France will be able to run their applications, store data locally, and better leverage real-time data, analytics, and AI technologies to differentiate, streamline, and transform their business—all on the cleanest cloud in the industry.

    “In order for Renault Group to become a tech company and accelerate its digital transformation, it is important to have what is best in the market. This new Google Cloud region in France is synonymous with more security, resilience and sovereignty, and lower latency, which altogether reinforces the value of the cloud solutions. We can therefore be certain to offer the highest level of services for our users and ultimately the best customer experience. It is also a more eco-friendly infrastructure that supports our efforts in sustainability, without compromising efficiency.” - Frédéric Vincent, Head of Information systems and Digital, Renault Group 

    “This new Google Cloud region brings us a smarter, more secure and local cloud. It enables us to comply with French and European security, compliance and sovereignty requirements, and is an opportunity to better serve our customers with new and always more relevant offerings.” - Pascal Luigi, Executive General Manager, BforBank 

    Tackling Europe’s digital challenges together 

    The new Paris region will allow local organizations from the private and public sector to take advantage of a transformation cloud to be:

    • Smarter: Data is the core ingredient in any business transformation.  Google Cloud enables you to unify data across the organization and leverage smart analytics capabilities and AI solutions to get the most value from structured or unstructured data, regardless of where it is stored. 

    • Open: Google Cloud’s commitment to multicloud, hybrid cloud, and open source provides the freedom to choose the best technology and the flexibility to fit specific needs, apps, and services while allowing developers to build and innovate faster, in any environment. 

    • Sustainable: At Google we’re working to build a carbon-free future for everyone. We are the only major cloud provider to purchase enough renewable energy to cover our entire operations and are working closely with every industry to help increase climate resilience by applying cloud technology to key challenges like responsible materials sourcing, climate risk analysis, and more. 

    • Secure: Google Cloud offers a zero-trust architecture to comprehensively protect data, applications, and users against potential threats and minimize attacks. We also work closely with local partners to help support compliance with local regulations. 

    Across Europe, companies of all sizes and in every industry are looking to migrate their mission-critical workloads and data to the cloud. But despite the proven benefits of cloud—from agility to scalability to performance and innovation potential—many IT decision makers have opted for lesser technology capabilities due to lack of trust. 

    Beyond powerful, embedded security capabilities, Google Cloud provides controls to help meet your unique compliance, privacy, and digital sovereignty needs, such as the ability to keep data in a European geographic region, local administrative and customer support, comprehensive visibility and control over administrative access, and encryption of data with keys that you control and manage outside of Google Cloud’s infrastructure.

    We have also formed a strategic partnership with French cybersecurity leader Thales to develop a trusted cloud offering, specifically designed to meet the sovereign cloud criteria defined by the French government. This new France cloud region will enable the development of  local offerings from this partnership, confirming our trajectory to become a “Cloud de confiance,”  as defined by the French authorities. Our customers in France will benefit from a cloud that meets their requirements for security, privacy, and sovereignty without having to compromise on functionality or innovation. 

    Visit our Paris region page for more details about the region, and our cloud locations page, where you’ll find updates on the availability of additional services and regions.

    Related Article

    Ciao, Milano! New cloud region in Milan now open

    The new Milan region provides low-latency, highly available services with international security and data protection standards.

    Read Article
  • What’s new with Google Cloud Wed, 29 Jun 2022 21:00:00 -0000

    Want to know the latest from Google Cloud? Find it here in one handy location. Check back regularly for our newest updates, announcements, resources, events, learning opportunities, and more. 

    Tip: Not sure where to find what you’re looking for on the Google Cloud blog? Start here: Google Cloud blog 101: Full list of topics, links, and resources.

    Week of June 27 - July 1, 2022

    • Launched Query Insights for Cloud Spanner - a new tool for visualizing query performance metrics and debugging query performance issues in the Cloud console!
    • Now in preview, BigQuery BI Engine Preferred Tables. Preferred tables enable BigQuery customers to prioritize specific tables for acceleration by BI Engine to ensure predictable performance and optimized use of resources. Read our blog to learn more.
    • MITRE ATT&CK® mappings for Google Cloud security capabilities through our research partnership with the MITRE Engenuity Center for Threat-Informed Defense. Learn more.
    • Launched a new way of accessing billing information — from the Cloud Console mobile app. Now, with your Android or iOS mobile device, you can access not only your resources (App Engine, Compute, Databases, Storage or IAM), logs, incidents, errors, but also your billing information. With these enhanced billing features, we are making it easier for you to understand your cloud spend. 
    • Eventarc adds support for Firebase Realtime Database. Now you can create Eventarc triggers to send Firebase Realtime Database events to your favorite destinations that Eventarc supports. 
    • PostgreSQL interface for Cloud Spanner is generally available. The PostgreSQL interface for Spanner combines the scalability and reliability of Spanner that enterprises trust with the familiarity and portability of PostgreSQL that development teams love. DevOps teams that have scaled their databases with brittle sharding or complex replication can now simplify their architecture with Spanner, using the tools and skills they already have. Get started today, for as low as $65 USD/month. Learn more.

    Week of June 20 - June 24, 2022

    • Read the latest Cloud Data Hero Story. This edition focuses on Francisco, the founder of Direcly, a Google Cloud partner. Francisco immigrated from Quito, Ecuador and founded his company from the ground up, without any external funding. Now, he’s finding innovative ways to leverage Google Cloud’s products for companies like Royal Caribbean International.

    Week of June 13 - June 17, 2022

    • Launched higher reservation limits for BigQuery BI Engine. BigQuery BI Engine now supports a default maximum reservation of 250GB per project for all customers. Previously this was 100GB. You can still request additional BI Engine reservations for your projects here. This is being rolled out in the Google Cloud Console over the next few days to all customers. Alternatively, all customers can already use a DDL statement as follows: 

      • ALTER BI_CAPACITY `<PROJECT_ID>.region-<REGION>.default` SET OPTIONS(size_gb = 250);

    • Don’t miss our first ever Google Cloud Sustainability Summit on June 28, 2022. Learn how business and technology leaders are building for the future, and get insights to help you enact sustainable change within your organization. At this digital event, you’ll have a chance to explore the latest tools and best practices that can help you solve your most complex challenges. And you’ll be among the first to find out about product updates across Google Cloud, Earth Engine, and Google Workspace. Register today for this no-cost, solution-packed event.
    • On June 14, 2022, we are unveiling the winners of this year’s Google Cloud Customer Awards. We received an unprecedented number of entries, and every participant can be proud of what their organization is achieving in the cloud today. The second annual Google Cloud Customer Awards celebrates organizations around the world that have continued to flex and adapt to new demands, while turning new ideas into interesting realities. Read our blog to check out the results.
    • The Cloud Digital Leader track is now part of the Google Cloud career readiness program, available for eligible faculty preparing their students for a cloud-first workforce. Students will build cloud literacy and learn the value of Google Cloud in driving digital transformation while also preparing for the Cloud Digital Leader certification exam. Learn more.

    Week of June 6 - June 10, 2022

    • Artifact Registry - Audit logs for Maven, npm, and Python repositories are now available in Cloud Logging. Documentation
    • Cloud Deploy New Region - Cloud Deploy is now available in the australia-southeast1 (Sydney) region. Release Notes
    • Cloud Deploy Terraform provider support. Cloud Deploy declarative resources, Delivery Pipeline and Target, are now available via the Google Cloud Deploy Terraform Provider. Documentation
    • Anthos on VMware user cluster lifecycle from the Google Cloud Console is now GA. You will now be able to create, delete, update, and see Anthos on VMware user clusters from the Google Cloud Console. To learn more about the feature, check out the Anthos documentation.
    • Granular instance sizing for Cloud Spanner is now generally available. Get started for as low as $40 per month and take advantage of 99.999% availability and scale as needed without downtime. With granular instance sizing, at a much lower cost you can still get all of the Spanner benefits like transparent replication across zones and regions, high-availability, resilience to different types of failures, and the ability to scale up and down as needed without any downtime.  Learn more.

    Week of May 30 - June 3, 2022

    • Google Cloud Deploy has updated its supported Skaffold version from 1.37.1 to 1.37.2, which is now the default Skaffold version. (Skaffold Docs)
    • Google Cloud just made it easier to compare the cost of modernization options. Want to look at Lift & Shift vs. Containerization options? The latest version of our fit assessment now includes cost guidance. See the release notes for more details.
    • Did you notice the new “Protect” tab in Google Kubernetes Engine? Protect for GKE automatically scans, identifies and suggests fixes for workload configuration risks by comparing your running workload config against industry best practices like the Kubernetes Pod Security Standards. Check out the documentation to learn more.
    • Google Cloud makes data warehouse migrations even easier with automated SQL translation as part of the BigQuery Migration Service. Learn more.
    • Google Cloud simplifies customer verification and benefits processing with Document AI for Identity cards now generally available. Automate identity verification and fraud detection workflows by extracting information from identity cards with a high degree of accuracy. Learn more.

    Week of May 23 - May 27, 2022

    • Artifact Registry is now available in more regions: europe-west9 (Paris, France), europe-southwest1 (Madrid, Spain), and us-east5 (Columbus, United States). Release Notes 
    • Change streams for Cloud Spanner is now generally available. With change streams, Spanner users are now able to track and stream out changes (inserts, updates, and deletes) from their Cloud Spanner database in near real time. Learn more.
    • Artifact Registry now supports new repository types. Apt and Yum repositories are now generally available. Release Notes
    • Business Messages announces the expansion of its partner ecosystem to include Twilio, Genesys, and Avaya - each a widely recognized global platform for customer care and communications. Read how they help businesses implement both AI Bot and Live Agent chat solutions to stay open for conversations and advance customers through the purchase funnel. And be sure to check out the new Business Messages partner directory!
    • Learn how to set up metrics and alerts to monitor errors in the Cloud SQL for SQL Server error log using Google Cloud’s operations suite with this blog post.

    Week of May 16 - May 20, 2022

    • Machine learning is among the most exciting, fastest-moving technology disciplines. Join us June 9th for Google Cloud Applied ML Summit, a digital event that brings together some of the world’s leading ML and data science professionals to explore the latest cutting-edge AI tools for developing, deploying, and managing ML models at scale.
    • Join us virtually on June 2nd at the Google Cloud Startup Summit where you’ll hear the latest announcements about how we’re investing in and supporting the startup ecosystem. You'll also learn from technology experts about streamlining your app development and creating better user experiences, and get insights from innovative venture capitalists and founders to help your startup grow. This event is headlined by our keynote with Google Cloud CEO Thomas Kurian and Dapper Labs Co-Founder and CEO Roham Gharegozlou as they discuss the paradigm changes being brought by web3 and how startups can prepare for this shift.
    • Google Cloud Managed Service for Prometheus introduced a new high-usage pricing tier to bring more value for Kubernetes users who want to move all of their metrics operations to the service, and dropped the pricing for existing tiers by 25 percent.
    • Hear the SRE team at Maisons du Monde detail their journey from running open source Prometheus to deciding that Managed Service for Prometheus was the best fit for their organization.
    • Google Cloud has launched Autonomic Security Operations (ASO) for the U.S. public sector, a solution to modernize threat management, in line with the objectives of the White House Executive Order 14028 and Office of Management and Budget M-21-31. ASO is a transformational approach to security operations, powered by our Chronicle and Siemplify, to comprehensively detect and respond to cyber telemetry across an agency while meeting the Event Logging Tier requirements of the EO.

    Week of May 9 - May 13, 2022

    • We just published a blog post announcing Google Cloud’s latest STAC-M3™ benchmark results. Following up on our 2018 STAC-M3 benchmark audit, a redesigned Google Cloud architecture achieved significant improvements: up to 18x faster, up to 9x higher throughput, and a new record in STAC-M3.β1.1T.YRHIBID-2.TIME. We also published a whitepaper on how we designed and optimized the cluster for API-driven cloud resources.
    • Security Command Center (SCC) released new finding types that alert customers when SCC is either misconfigured or configured in a way that prevents it from operating as expected. These findings provide remediation steps to return SCC to an operational state. Learn more and see examples.

    Week of May 2 - May 6, 2022

    • As part of Anthos release 1.11, Anthos Clusters on Azure and Anthos Clusters on AWS now support Kubernetes versions 1.22.8-gke.200 and 1.21.11-gke.100. As a preview feature, you can now choose Windows as your node pool image type when you create node pools with Kubernetes version 1.22.8. For more information, check out the Anthos multi cloud website.
    • The Google Cloud Future of Data whitepaper explores why the future of data will involve three key themes: unified, flexible, and accessible.
    • Learn about BigQuery BI Engine and how to analyze large and complex datasets interactively with sub-second query response time and high concurrency. Now generally available.
    • Announcing the launch of the second series of the Google Cloud Technical Guides for Startups, a video series for technical enablement aimed at helping startups to start, build and grow their businesses.
    • Solving for food waste with data analytics in Google Cloud. Explore why it is so necessary as a retailer to bring your data to the cloud to apply analytics to minimize food waste.
    • Mosquitoes get the swat with new Mosquito Forecast built by OFF! Insect Repellents and Google Cloud. Read how SC Johnson built an app that predicts mosquito outbreaks in your area.

    Week of April 25 - April 29, 2022

    Week of April 18 - April 22, 2022 

    Week of April 11 - April 15, 2022 

    • Machine learning company Moloco uses Cloud Bigtable to process 5+ million ad bid requests per second. Learn how Moloco uses Bigtable to keep up in a speedy market and process ad requests at unmatched speed and scale.
    • The Broad Institute of MIT and Harvard speeds scientific research with Cloud SQL. One of our customers, the Broad Institute, shares how they used Cloud SQL to accelerate scientific research. In this customer story, you will learn how the Broad Institute was able to get Google’s database services up and running quickly and lower their operational burden by using Cloud SQL.
    • Data Cloud Summit ‘22 recap blog on April 12: Didn’t get a chance to watch the Google Data Cloud Summit this year? Check out our recap to learn the top five takeaways - learn more about product announcements, customer speakers, partners, product demos and check out more resources on your favorite topics.
    • The new Professional Cloud Database Engineer certification in beta is here. By participating in this beta, you will directly influence and enhance the learning and career path for Cloud Database Engineers globally. Learn more and sign up today.
    • Learn how to use Kubernetes Jobs and cost-optimized Spot VMs to run and manage fault-tolerant AI/ML batch workloads on Google Kubernetes Engine.
    • Expanding Eventarc presence to 4 new regions—asia-south2, australia-southeast2, northamerica-northeast2, southamerica-west1. You can now create Eventarc resources in 30 regions.

    Week of April 4 - April 8, 2022 

    • Join us at the Google Data Cloud Summit on Wednesday, April 6, at 9 AM PDT.  Learn how Google Cloud technologies across AI, machine learning, analytics, and databases have helped organizations such as Exabeam, Deutsche Bank, and PayPal to break down silos, increase agility, derive more value from data, and innovate faster. Register today for this no cost digital event.
    • Announcing the first Data Partner Spotlight, on May 11th 
      We saved you a seat at the table to learn about the Data Cloud Partners in the Google Cloud ecosystem. We will spotlight technology partners and deep dive into their solutions, so business leaders can make smarter decisions and solve complex data challenges with Google Cloud. Register today for this digital event.
    • Introducing Vertex AI Model Registry, a central repository to manage and govern the lifecycle of your ML models. Designed to work with any type of model and deployment target, including BigQuery ML, Vertex AI Model Registry makes it easy to manage and deploy models. Learn more about Google’s unified data and AI offering.
    • Vertex AI Workbench is now GA, bringing together Google Cloud’s data and ML systems into a single interface so that teams have a common toolset across data analytics, data science, and machine learning. With native integrations across BigQuery, Spark, Dataproc, and Dataplex, data scientists can build, train and deploy ML models 5X faster than traditional notebooks. Don’t miss this ‘How to’ session from the Data Cloud Summit.

    Week of Mar 28 - April 1, 2022

    • Learn how Google Cloud’s network and Network Connectivity Center can transform the private wires used for voice trading.
    • Anthos bare metal 1.11 minor release is available now. Containerd is the default runtime in Anthos clusters on bare metal in this release. Examples of the feature enhancements include:
        • Upgraded Anthos clusters on bare metal to use Kubernetes version 1.22

        • Added Egress Network Address Translation (NAT) gateway capability to provide persistent, deterministic routing for egress traffic from clusters

        • Enabled IPv4/IPv6 dual-stack support

        • Additional enhancements can be found in the release notes here

    Week of Mar 21 - Mar 25, 2022

    • Google Cloud’s Behnaz Kibria reflects on a recent fireside chat that she moderated with Google Cloud’s Phil Moyer and former SEC Commissioner, Troy Paredes at FIA Boca. The discussion focused on the future of markets and policy, the new technologies that are already paving the way for greater speed and transparency, and what it will take to ensure greater resiliency, performance and security over the longer term. Read the blog.
    • Eventarc adds support for Firebase Alerts. Now you can create Eventarc triggers to send Firebase Alerts events to your favorite destinations that Eventarc supports.
    • Now you can control how your alerts handle missing data from telemetry data streams using Alert Policies in the Cloud Console or via API. In cloud ecosystems there are millions of data sources, and often, there are pauses or breaks in their telemetry data streams. Configure how this missing data influences your open incidents:

      • Option 1: Missing data is treated as “above the threshold,” and your incidents will stay open.

      • Option 2: Missing data is evaluated as “below the threshold,” and the incident will close after your retest window period.
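
    The effect of the two options above on a single datapoint can be sketched as a small predicate. The function name and the "above"/"below" policy labels are illustrative, not the Cloud Monitoring API.

```python
# Sketch of the two missing-data policies: an incident stays open while
# datapoints evaluate "above the threshold". Names are illustrative only.

def evaluate_point(value, threshold, missing_policy):
    """Return True if this datapoint keeps the incident open.

    value is None when the telemetry stream had a gap.
    missing_policy: "above" (Option 1) or "below" (Option 2).
    """
    if value is None:
        return missing_policy == "above"
    return value > threshold
```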

    Week of Mar 14 - Mar 18, 2022

    • Natural language processing is a critical AI tool for understanding unstructured, often technical healthcare information, like clinical notes and lab reports. See how leading healthcare organizations are exploring NLP to unlock hidden value in their data.
    • A handheld lab: Read how Cue Health is revolutionizing healthcare diagnostics for COVID-19 and beyond—all from the comfort of home.
    • Providing reliable technical support for an increasingly distributed, hybrid workforce is becoming all the more crucial, and challenging. Cloud Customer Care has added a range of new offerings and features for businesses of all sizes to help you find the Google Cloud technical support services that are best for your needs and budget.
    • #GoogleforGames Dev Summit is NOW LIVE. Watch the keynote followed by over 20 product sessions on-demand to help you build high quality games and reach audiences around the world. Watch → g.co/gamedevsummit
    • Meeting (and ideally, exceeding) consumer expectations today is often a heavy lift for many companies—especially those running modern apps on legacy, on-premises databases. Read how Google Cloud database services provide you the best options for industry-leading reliability, global scale & open standards, enabling you to make your next big idea a reality. Read this blog.

    Week of Mar 07 - Mar 11, 2022

    • Learn how Google Cloud Partner Advantage partners help customers solve real-world business challenges in retail and ecommerce through data insights.
    • Introducing Community Security Analytics, an open-source repository of queries for self-service security analytics. Get started analyzing your own Google Cloud logs with BigQuery or Chronicle to detect potential threats to your workloads, and to audit usage of your data. Learn more.
    • On a mission to accelerate the world's adoption of a modern approach to threat management through Autonomic Security Operations, our latest update expands our ASO technology stack with Siemplify, offers a solution to the latest White House Executive Order 14028, introduces a community-based security analytics repository, and announces key R&D initiatives that we’re investing in to bolster threat-informed defenses worldwide. Read more here
    • Account defender, available today in public preview, is a feature in reCAPTCHA Enterprise that takes behavioral detection a step further. It analyzes the patterns of behavior for an individual account, in addition to the patterns of behavior of all user accounts associated with your website. Read more here.
    • Maximize your Cloud Spanner savings with new committed use discounts. Get up to 40% discount on Spanner compute capacity by purchasing committed use discounts. Once you make a commitment to spend a certain amount on an hourly basis on Spanner from a billing account, you can get discounts on instances in different instance configurations, regions, and projects associated with that billing account. This flexibility helps you achieve a high utilization rate of your commitment across regions and projects without manual intervention, saving you time and money. Learn more. 
    • In many places across the globe, March is celebrated as Women’s History Month, and March 8th, specifically, marks the day known around the world as International Women’s Day. Google Cloud, in partnership with Women Techmakers, has created an opportunity to bridge the gaps in the credentialing space by offering a certification journey for Ambassadors of the Women Techmakers community. Learn more.
    • Learn how to accelerate vendor due diligence on Google Cloud by leveraging third party risk management providers.
    • Hybrid work should not derail DEI efforts. If you’re moving to a hybrid work model, here’s how to make diversity, equity and inclusion central to it.
    • Learn how Cloud Data Fusion provides scalable data integration pipelines to help consolidate a customer’s SAP and non-SAP datasets within BigQuery.
    • Hong Kong–based startup TecPal builds and manages smart hardware and software for household appliances all over the world using Google Cloud. Find out how.
    • Eventarc adds support for Firebase Remote Config and Test Lab in preview. Now you can create Eventarc triggers to send Firebase Remote Config or Firebase Test Lab events to your favorite destinations that Eventarc supports. 
    • Anthos Service Mesh Dashboard is now available (public preview) on Anthos clusters on Bare Metal and Anthos clusters on VMware. Customers can now get out-of-the-box telemetry dashboards to see a services-first view of their application in the Cloud Console.
    • Micro Focus Enterprise Server Google Cloud blueprint performs an automated deployment of Enterprise Server inside a new VPC or existing VPC. Learn more.
    • Learn how to wire your application logs with more information without adding a single line of code and get more insights with the new version of the Java library.
    • Pacemaker alerts in Google Cloud: cluster alerting enables the system administrator to be notified about critical events of enterprise workloads in GCP, like the SAP solutions.
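
    The committed use discount item above quotes savings of up to 40% on Spanner compute. A rough back-of-the-envelope calculation of what that means monthly, using a purely hypothetical hourly rate rather than published pricing:

```python
# Rough arithmetic for a committed use discount, using the "up to 40%"
# figure above. The hourly rate is hypothetical, not a published price.
on_demand_hourly = 1.00                       # hypothetical $/hour of Spanner compute
committed_hourly = on_demand_hourly * (1 - 0.40)

hours_per_month = 730
monthly_savings = (on_demand_hourly - committed_hourly) * hours_per_month
print(f"${monthly_savings:.2f} saved per month")
```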

    Week of Feb 28 - Mar 04, 2022

    • Announcing the Data Cloud Summit, April 6th!—Ready to dive deep into data? Join us at the Google Data Cloud Summit on Wednesday, April 6, at 9 AM PDT. This three-hour digital event is packed with content and experiences designed to help you unlock innovation in your organization. Learn how Google Cloud technologies across AI, machine learning, analytics, and databases have helped organizations such as Exabeam, Deutsche Bank, and PayPal to break down silos, increase agility, derive more value from data, and innovate faster. Register today for this no cost digital event.
    • Google Cloud addresses concerns about how its customers might be impacted by the invasion of Ukraine. Read more.
    • Eventarc is now HIPAA compliant— Eventarc is covered under the Google Cloud Business Associate Agreement (BAA), meaning it has achieved HIPAA compliance. Healthcare and life sciences organizations can now use Eventarc to send events that require HIPAA compliance.
    • Eventarc trigger for Workflows is now available in Preview. You can now select Workflows as a destination for events originating from any supported event provider.
    • Error Reporting automatically captures exceptions found in logs ingested by Cloud Logging from the following languages: Go, Java, Node.js, PHP, Python, Ruby, and .NET, aggregates them, and then notifies you of their existence.
    • Learn more about how USAA partnered with Google Cloud to transform its operations by leveraging AI to drive efficiency in vehicle insurance claims estimation.
    • Learn how Google Cloud and NetApp’s ability to “burst to cloud,” seamlessly spinning up compute and storage on demand, accelerates EDA design testing.
    • Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team.
    • Google Cloud Easy as Pie Hackathon: the results are in.
    • VPC Flow Logs Org Policy Constraints allow users to enforce VPC Flow Logs enablement across their organization, and impose minimum and maximum sampling rates. VPC Flow Logs are used to understand network traffic for troubleshooting, optimization and compliance purposes.
    • Google Cloud Managed Service for Prometheus is now generally available. Get all of the benefits of open source-compatible monitoring with the ease of use of Google-scale managed services. 
    • Google Cloud Deploy now supports Anthos clusters bringing opinionated, fully managed continuous delivery for hybrid and multicloud workloads. Cloud Deploy provides integrated best practices, security, and metrics from a centralized control plane.
    • Learn Google Workspace’s vision for frontline workers and how our Frontline solution innovations can bridge collaboration and productivity across in-office and remote workforces.
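
The Error Reporting item above hinges on exceptions appearing in log entries with their full stack traces, which is what lets the service aggregate identical errors. A minimal Python sketch (the logger name and message are illustrative, not from the announcement):

```python
import io
import logging

def report_failure(logger: logging.Logger) -> None:
    try:
        1 / 0  # stand-in for real application work
    except ZeroDivisionError:
        # exc_info=True attaches the traceback to the log record; a
        # backend that scans ingested logs can then group the errors.
        logger.error("work item failed", exc_info=True)

# Capture the log output locally to show what would be ingested.
stream = io.StringIO()
logger = logging.getLogger("app")
logger.addHandler(logging.StreamHandler(stream))
report_failure(logger)
print("ZeroDivisionError" in stream.getvalue())  # → True
```

The same pattern applies in any of the listed languages: log the exception with its traceback, not just a message string.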

    Week of Feb 21 - Feb 25, 2022

    • Read how Paerpay promotes bigger tabs and faster, more pleasant transactions with Google Cloud and the Google for Startups Cloud Program.
    • Learn about the advancements we’ve released for our Google Cloud Marketplace customers and partners in the last few months.
    • BBVA collaborated with Google Cloud to create one of the most successful Google Cloud training programs for employees to date. Read how they did it.
    • Google for Games Developer Summit returns March 15 at 9AM PT! Learn about our latest games solutions and product innovations. It’s online and open to all. Check out the full agenda at g.co/gamedevsummit.
    • Build a data mesh on Google Cloud with Dataplex (now GA 🎉). Read how Dataplex enables customers to centrally manage, monitor, and govern distributed data, and makes it securely accessible to a variety of analytics and data science tools.
    • While understanding what is happening now has great business value, forward-thinking companies like Tyson Foods are taking things a step further, using real-time analytics integrated with artificial intelligence (AI) and business intelligence (BI) to answer the question, “what might happen in the future?”
    • Join us for the first Google Cloud Security Talks of 2022, happening on March 9th. Modernizing SecOps is a top priority for so many organizations. Register to attend and learn how you can enhance your approach to threat detection, investigation and response!
    • Google Cloud introduces its Data Hero series with a profile on Lynn Langit, a data cloud architect, educator, and developer on GCP.
    • Building ML solutions? Check out these guidelines for ensuring quality in each process of the MLOps lifecycle.
    • Eventarc is now Payment Card Industry Data Security Standard (PCI DSS)-compliant.

    Week of Feb 14 - Feb 18, 2022

    • The Google Cloud Retail Digital Pulse - Asia Pacific is an ongoing annual assessment, carried out in partnership with IDC Retail Insights, of the maturity of retail digital transformation in the Asia Pacific region. The study covers 1,304 retailers across eight markets and sub-segments, investigating their digital maturity across five dimensions (strategy, people, data, technology, and process) to arrive at a four-stage Digital Pulse Index, with 4 being the most mature. It provides great insights into the various stages of digital maturity of Asian retailers, their drivers for digitization, challenges, innovation hotspots, and focus areas with respect to use cases and technologies.
    • Deploying Cloud Memorystore for Redis for any scale: Learn how you can scale Cloud Memorystore for high-volume use cases by leveraging client-side sharding. This blog provides a step-by-step walkthrough that demonstrates how you can adapt your existing application to scale to the highest levels with the help of the Envoy Proxy. Read our blog to learn more.
    • Check out how six SAP customers are driving value with BigQuery.
    • This Black History Month, we're highlighting Black-led startups using Google Cloud to grow their businesses. Check out how DOSS and its co-founder, Bobby Bryant, disrupts the real estate industry with voice search tech and analytics on Google Cloud.
    • Vimeo leverages managed database services from Google Cloud to serve up billions of views around the world each day. Read how it uses Cloud Spanner to deliver a consistent and reliable experience to its users no matter where they are.
    • How can serverless best be leveraged? Can cloud credits be maximized? Are all managed services equal? We dive into top questions for startups.
    • Google introduces a Sustainability value pillar in the GCP Active Assist solution to accelerate our industry leadership in CO2 reduction and environmental protection efforts. An intelligent carbon footprint reduction tool is launched in preview.
    • Central States health insurance CIO Pat Moroney shares highs and lows from his career transforming IT. Read more.
    • Traffic Director client authorization for proxyless gRPC services is now generally available. Combine with managed mTLS credentials in GKE to centrally manage access between workloads using Traffic Director. Read more.
    • Cloud Functions (2nd gen) is now in public preview. The next generation of our Cloud Functions Functions-as-a-Service platform gives you more features, control, performance, scalability, and event sources. Learn more.
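
The client-side sharding approach in the Cloud Memorystore walkthrough above relies on every client deterministically mapping each key to one shard. A minimal sketch of ring (consistent) hashing, similar in spirit to what Envoy's ring hash load balancing does; the shard names are hypothetical:

```python
import bisect
import hashlib

def _hash(value: str) -> int:
    # A stable hash, so every client maps keys identically.
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class HashRing:
    """Consistent-hash ring: a key maps to the nearest shard clockwise."""

    def __init__(self, shards, vnodes=64):
        # Each shard gets several virtual points for an even spread.
        self._ring = sorted(
            (_hash(f"{s}#{i}"), s) for s in shards for i in range(vnodes)
        )
        self._points = [h for h, _ in self._ring]

    def shard_for(self, key: str) -> str:
        idx = bisect.bisect(self._points, _hash(key)) % len(self._ring)
        return self._ring[idx][1]

ring = HashRing(["redis-shard-0", "redis-shard-1", "redis-shard-2"])
print(ring.shard_for("user:42"))  # deterministic shard choice
```

Because the mapping is stable, adding a shard only remaps the keys that fall on the new shard's ring segments, which is what makes scaling out low-impact.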

    Week of Feb 7 - Feb 11, 2022

    • Now announcing the general availability of the newest instance series in our Compute Optimized family, C2D—powered by 3rd Gen AMD EPYC processors. Read how C2D provides larger instance types, and memory per core configurations ideal for customers with performance-intensive workloads.
    • Digital health startup expands its impact on healthcare equity and diversity with Google Cloud Platform and the Google for Startups Accelerator for Black Founders. Read more.
    • Storage Transfer Service support for agent pools is now generally available (GA). You can use agent pools to create isolated groups of agents as a source or sink entity in a transfer job. This enables you to transfer data from multiple data centers and filesystems concurrently, without creating multiple projects for a large transfer spanning multiple filesystems and data centers. This option is available via API, Console, and the gcloud transfer CLI.
    • The five trends driving healthcare and life sciences in 2022 will be powered by accessible data, AI, and partnerships.
    • Learn how COLOPL, Minna Bank and 7-Eleven Japan use Cloud Spanner to solve their scalability, performance and digital transformation challenges.

    Week of Jan 31 - Feb 4, 2022

    • Pub/Sub Lite goes regional. Pub/Sub Lite is a high-volume messaging service with ultra-low cost that now offers regional Lite topics, in addition to existing zonal Lite topics. Unlike zonal topics which are located in a single zone, regional topics are asynchronously replicated across two zones. Multi-zone replication protects from zonal failures in the service. Read about it here.
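
The zonal-versus-regional distinction above shows up directly in a Lite topic's resource path, where the location component is either a zone or a region. A small illustrative helper (the project and topic names are hypothetical):

```python
def lite_topic_path(project: str, location: str, topic: str) -> str:
    # Pub/Sub Lite topics are addressed by a location, which can be a
    # zone (zonal topic) or a region (regional, replicated topic).
    return f"projects/{project}/locations/{location}/topics/{topic}"

zonal = lite_topic_path("my-project", "us-central1-a", "orders")
regional = lite_topic_path("my-project", "us-central1", "orders")
print(zonal)     # projects/my-project/locations/us-central1-a/topics/orders
print(regional)  # projects/my-project/locations/us-central1/topics/orders
```

Publishers and subscribers address a regional topic the same way as a zonal one; the asynchronous two-zone replication happens inside the service.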

    • Google Workspace is making it easy for employees to bring modern collaboration to work, even if their organizations are still using legacy tools. Essentials Starter is a no-cost offer designed to help people bring the apps they know and love to use in their personal lives to their work life. Learn more.

    • We’re now offering 30 days free access to role-based Google Cloud training with interactive labs and opportunities to earn skill badges to demonstrate your cloud knowledge. Learn more.

    • Security Command Center (SCC) Premium adds support for additional compliance benchmarks, including CIS Google Cloud Computing Foundations 1.2 and OWASP Top 10 2017 & 2021. Learn more about how SCC helps manage and improve your cloud security posture.

    • Storage Transfer Service now offers Preview support for transfers from self-managed object storage systems via user-managed agents. With this new feature, customers can seamlessly copy PBs of data from cloud or on-premises object storage to Google Cloud Storage. Object storage sources must be compatible with Amazon S3 APIs. For customers migrating from AWS S3 to GCS, this feature gives an option to control network routes to Google Cloud. Fill out this signup form to access this STS feature.

    Week of Jan 24-Jan 28, 2022

    • Learn how Sabre leveraged a 10-year partnership with Google Cloud to power the travel industry with innovative technology. As Sabre embarked on a cloud transformation, it sought managed database services from Google Cloud that enabled low latency and improved consistency. Sabre discovered how the strengths of both Cloud Spanner and Bigtable supported unique use cases and led to high performance solutions.

    • Storage Transfer Service now offers Preview support for moving data between two filesystems and keeping them in sync on a periodic schedule. This launch offers a managed way to migrate from a self-managed filesystem to Filestore. If you have on-premises systems generating massive amounts of data that needs to be processed in Google Cloud, you can now use Storage Transfer Service to accelerate data transfer from an on-prem filesystem to a cloud filesystem. See Transfer data between POSIX file systems for details.
    • Storage Transfer Service now offers Preview support for preserving POSIX attributes and symlinks when transferring to, from, and between POSIX filesystems. Attributes include the user ID of the owner, the group ID of the owning group, the mode or permissions, the modification time, and the size of the file. See Metadata preservation for details.
    • Bigtable Autoscaling is Generally Available (GA): Bigtable Autoscaling automatically adds or removes capacity in response to the changing demand for your applications. With autoscaling, you only pay for what you need and you can spend more time on your business instead of managing infrastructure.  Learn more.
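
The POSIX attributes that Storage Transfer Service now preserves, as listed above (owner user ID, owning group ID, mode, modification time, and size), are exactly what a plain `os.stat` surfaces on a POSIX filesystem. A sketch of collecting them for one file:

```python
import os
import stat
import tempfile

def posix_metadata(path: str) -> dict:
    """Collect the POSIX attributes a filesystem transfer would preserve."""
    st = os.stat(path, follow_symlinks=False)  # don't resolve symlinks
    return {
        "uid": st.st_uid,                   # user ID of the owner
        "gid": st.st_gid,                   # group ID of the owning group
        "mode": stat.S_IMODE(st.st_mode),   # permission bits only
        "mtime": int(st.st_mtime),          # modification time (seconds)
        "size": st.st_size,                 # file size in bytes
    }

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
meta = posix_metadata(f.name)
print(meta["size"])  # → 5
os.unlink(f.name)
```

Note `follow_symlinks=False`: symlink preservation means recording the link itself, not the attributes of its target.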

    Week of Jan 17-Jan 21, 2022

    • Sprinklr and Google Cloud join forces to help enterprises reimagine their customer experience management strategies. Hear more from Nirav Sheth, Director of ISV/Marketplace & Partner Sales.
    • Firestore Key Visualizer is Generally Available (GA): Firestore Key Visualizer is an interactive, performance monitoring tool that helps customers observe and maximize Firestore’s performance. Learn more.
    • Like many organizations, Wayfair faced the challenge of deciding which cloud databases they should migrate to in order to modernize their business and operations. Ultimately, they chose Cloud SQL and Cloud Spanner because of the databases’ clear path for shifting workloads as well as the flexibility they both provide. Learn how Wayfair was able to migrate quickly while still being able to serve production traffic at scale.

    Week of Jan 10-Jan 14, 2022

    • Start your 2022 New Year’s resolutions by learning at no cost how to use Google Cloud. Read more to find out how to take advantage of these training opportunities.
    • 8 megatrends drive cloud adoption—and improve security for all. Google Cloud CISO Phil Venables explains the eight major megatrends powering cloud adoption, and why they’ll continue to make the cloud more secure than on-prem for the foreseeable future. Read more.

    Week of Jan 3-Jan 7, 2022

    • Google Transfer Appliance announces General Availability of online mode. Customers collecting data at edge locations (e.g. cameras, cars, sensors) can offload to Transfer Appliance and stream that data to a Cloud Storage bucket. Online mode can be toggled to send the data to Cloud Storage over the network, or offline by shipping the appliance. Customers can monitor their online transfers for appliances from Cloud Console.

    Week of Dec 27-Dec 31, 2021

    • The most-read blogs about Google Cloud compute, networking, storage and physical infrastructure in 2021. Read more.

    • Top Google Cloud managed container blogs of 2021.

    • Four cloud security trends that organizations and practitioners should be planning for in 2022—and what they should do about them. Read more.

    • Google Cloud announces the top data analytics stories from 2021 including the top three trends and lessons they learned from customers this year. Read more.

    • Explore Google Cloud’s Contact Center AI (CCAI) and its momentum in 2021. Read more.

    • An overview of the innovations that Google Workspace delivered in 2021 for Google Meet. Read more.

    • Google Cloud’s top artificial intelligence and machine learning posts from 2021. Read more.

    • How we’ve helped break down silos, unearth the value of data, and apply that data to solve big problems. Read more.

    • A recap of the year’s infrastructure progress, from impressive Tau VMs, to industry-leading storage capabilities, to major networking leaps. Read more.

    • Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team. Read more.

    • Google Cloud - A cloud built for developers — 2021 year in review. Read more.

    • API management continued to grow in importance in 2021, and Apigee continued to innovate capabilities for customers, new solutions, and partnerships. Read more.

    • Recapping Google’s progress in 2021 toward running on 24/7 carbon-free energy by 2030 — and decarbonizing the electricity system as a whole. Read more.

    Week of Dec 20-Dec 24, 2021

    • And that’s a wrap! After engaging in countless customer interviews, we’re sharing our top 3 lessons learned from our data customers in 2021. Learn what customer data journeys inspired our top picks and what made the cut here.
    • Cloud SQL now shows you minor version information. For more information, see our documentation.
    • Cloud SQL for MySQL now allows you to select your MySQL 8.0 minor version when creating an instance and upgrade MySQL 8.0 minor version. For more information, see our documentation.
    • Cloud SQL for MySQL now supports database auditing. Database auditing lets you track specific user actions in the database, such as table updates, read queries, user privilege grants, and others. To learn more, see MySQL database auditing.

    Week of Dec 12-Dec 17, 2021

    • A critical vulnerability in a widely used logging library, Apache’s Log4j, has become a global security incident. Security researchers around the globe warn that this could have serious repercussions. Two Google Cloud Blog posts describe how Cloud Armor and Cloud IDS both help mitigate the threat.
    • Take advantage of these ten no-cost trainings before 2022. Check them out here.
    • Deploy Task Queues alongside your Cloud Application: Cloud Tasks is now available in 23 GCP Regions worldwide. Read more.
    • Managed Anthos Service Mesh support for GKE Autopilot (Preview): GKE Autopilot with Managed ASM provides ease of use and simplified administration capabilities, allowing customers to focus on their application, not the infrastructure. Customers can now let Google handle the upgrade and lifecycle tasks for both the cluster and the service mesh. Configure Managed ASM with asmcli experiment in GKE Autopilot cluster.
    • Policy Troubleshooter for BeyondCorp Enterprise is now generally available! Using this feature, admins can triage access failure events and perform the necessary actions to unblock users quickly. Learn more by registering for Google Cloud Security Talks on December 15 and attending the BeyondCorp Enterprise session. The event is free to attend and sessions will be available on-demand.
    • Google Cloud Security Talks, Zero Trust Edition: This week, we hosted our final Google Cloud Security Talks event of the year, focused on all things zero trust. Google pioneered the implementation of zero trust in the enterprise over a decade ago with our BeyondCorp effort, and we continue to lead the way, applying this approach to most aspects of our operations. Check out our digital sessions on-demand to hear the latest updates on Google’s vision for a zero trust future and how you can leverage our capabilities to protect your organization in today’s challenging threat environment.

    Week of Dec 6-Dec 10, 2021

    • 5 key metrics to measure cloud FinOps impact in 2022 and beyond - Learn about the 5 key metrics to effectively measure the impact of Cloud FinOps across your organization, and leverage the metrics to gain insights, prioritize strategic goals, and drive enterprise-wide adoption. Learn more.
    • We announced Cloud IDS, our new network security offering, is now generally available. Cloud IDS, built with Palo Alto Networks’ technologies, delivers easy-to-use, cloud-native, managed, network-based threat detection with industry-leading breadth and security efficacy. To learn more, and to request a 30-day trial credit, see the Cloud IDS webpage.

    Week of Nov 29-Dec 3, 2021

    • Join Cloud Learn, happening from Dec. 8-9: This interactive learning event will have live technical demos, Q&As, career development workshops, and more covering everything from Google Cloud fundamentals to certification prep. Learn more.

    • Get a deep dive into BigQuery Administrator Hub: with Resource Charts and Slot Estimator, administrators can better manage BigQuery at scale. Learn more about these tools and just how easy they are to use here.

    • New data and AI in media blog - How data and AI can help media companies better personalize, and what to watch out for. We interviewed Googlers Gloria Lee, Executive Account Director of Media & Entertainment, and John Abel, Technical Director for the Office of the CTO, to share exclusive insights on how media organizations should think about their data and make the most of it in the new era of direct-to-consumer. Watch our video interview with Gloria and John and read more.

    • Datastream is now generally available (GA): Datastream, a serverless change data capture (CDC) and replication service, allows you to synchronize data across heterogeneous databases, storage systems, and applications reliably and with minimal latency to support real-time analytics, database replication, and event-driven architectures. Datastream currently supports CDC ingestion from Oracle and MySQL to Cloud Storage, with additional sources and destinations coming in the future. Datastream integrates with Dataflow and Cloud Data Fusion to deliver real time replication to a wide range of destinations, including BigQuery, Cloud Spanner and Cloud SQL. Learn more.

    Week of Nov 22 - Nov 26, 2021

    • Security Command Center (SCC) launches new mute findings capability: We’re excited to announce a new “Mute Findings” capability in SCC that helps you gain operational efficiencies by effectively managing the findings volume based on your organization’s policies and requirements. SCC presents potential security risks in your cloud environment as ‘findings’ across misconfigurations, vulnerabilities, and threats. With the launch of the ‘mute findings’ capability, you gain a way to reduce findings volume and focus on the security issues that are highly relevant to you and your organization. To learn more, read this blog post and watch this short demo video.

    Week of Nov 15 - Nov 19, 2021

    • Cloud Spanner is our distributed, globally scalable SQL database service that decouples compute from storage, which makes it possible to scale processing resources separately from storage. This means that horizontal upscaling is possible with no downtime for achieving higher performance on dimensions such as operations per second for both reads and writes. The distributed scaling nature of Spanner’s architecture makes it an ideal solution for unpredictable workloads such as online games. Learn how you can get started developing global multiplayer games using Spanner.

    • New Dataflow templates for Elasticsearch released to help customers process and export Google Cloud data into their Elastic Cloud. You can now push data from Pub/Sub, Cloud Storage or BigQuery into your Elasticsearch deployments in a cloud-native fashion. Read more for a deep dive on how to set up a Dataflow streaming pipeline to collect and export your Cloud Audit logs into Elasticsearch, and analyze them in the Kibana UI.

    • We’re excited to announce the public preview of Google Cloud Managed Service for Prometheus, a new monitoring offering designed for scale and ease of use that maintains compatibility with the open-source Prometheus ecosystem. While Prometheus works well for many basic deployments, managing Prometheus can become challenging at enterprise scale. Learn more about the service in our blog and on the website.

    Week of Nov 8 - Nov 12, 2021

    Week of Nov 1 - Nov 5, 2021

    • Time to live (TTL) reduces storage costs, improves query performance, and simplifies data retention in Cloud Spanner by automatically removing unneeded data based on user-defined policies. Unlike custom scripts or application code, TTL is fully managed and designed for minimal impact on other workloads. TTL is generally available today in Spanner at no additional cost. Read more.
    • New whitepaper available: Migrating to .NET Core/5+ on Google Cloud - This free whitepaper, written for .NET developers and software architects who want to modernize their .NET Framework applications, outlines the benefits and things to consider when migrating .NET Framework apps to .NET Core/5+ running on Google Cloud. It also offers a framework with suggestions to help you build a strategy for migrating to a fully managed Kubernetes offering or to Google serverless. Download the free whitepaper.
    • Export from Google Cloud Storage: Storage Transfer Service now offers Preview support for exporting data from Cloud Storage to any POSIX file system. You can use this bidirectional data movement capability to move data in and out of Cloud Storage, on-premises clusters, and edge locations including Google Distributed Cloud. The service provides built-in capabilities such as scheduling, bandwidth management, retries, and data integrity checks that simplify the data transfer workflow. For more information, see Download data from Cloud Storage.
    • Document Translation is now GA! Translate documents in real-time in 100+ languages, and retain document formatting. Learn more about new features and see a demo on how Eli Lilly translates content globally.
    • Announcing the general availability of Cloud Asset Inventory console - We’re excited to announce the general availability of the new Cloud Asset Inventory user interface. In addition to all the capabilities announced earlier in Public Preview, the general availability release provides powerful search and easy filtering capabilities. These capabilities enable you to view details of resources and IAM policies, machine type and policy statistics, and insights into your overall cloud footprint. Learn more about these new capabilities by using the searching resources and searching IAM policies guides. You can get more information about Cloud Asset Inventory using our product documentation.
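
The Spanner TTL feature at the top of this section boils down to a row-age check against a user-defined policy, applied continuously by the managed service. A pure-Python sketch of the eligibility test; the 30-day window and the timestamps are hypothetical:

```python
from datetime import datetime, timedelta, timezone

def ttl_expired(created_at: datetime, max_age_days: int,
                now: datetime) -> bool:
    """True if a row is older than the TTL policy and eligible for deletion."""
    return created_at < now - timedelta(days=max_age_days)

now = datetime(2021, 11, 5, tzinfo=timezone.utc)
old_row = datetime(2021, 9, 1, tzinfo=timezone.utc)
fresh_row = datetime(2021, 11, 1, tzinfo=timezone.utc)
print(ttl_expired(old_row, 30, now))    # → True
print(ttl_expired(fresh_row, 30, now))  # → False
```

The point of the managed feature is that this check and the background deletions run inside Spanner, so no cron job or application code has to implement the loop above.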

    Week of Oct 25 - Oct 29, 2021

    • BigQuery table snapshots are now generally available. A table snapshot is a low-cost, read-only copy of a table's data as it was at a particular time.
    • By establishing a robust value measurement approach to track and monitor the business value metrics toward business goals, we are bringing technology, finance, and business leaders together through the discipline of Cloud FinOps to show how digital transformation is enabling the organization to create new innovative capabilities and generate top-line revenue. Learn more.
    • We’ve announced BigQuery Omni, a new multicloud analytics service that allows data teams to perform cross-cloud analytics - across AWS, Azure, and Google Cloud - all from one viewpoint. Learn how BigQuery Omni works and what data and business challenges it solves here.
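
A table snapshot, as introduced above, is easiest to reason about as a frozen, read-only copy of the data at a point in time. A toy Python illustration of those semantics (not how BigQuery stores snapshots internally, which is what keeps them low-cost):

```python
import types

# The snapshot is a read-only view of the data as it was at snapshot
# time; later writes to the base table don't affect it.
base_table = {"row1": "a", "row2": "b"}
snapshot = types.MappingProxyType(dict(base_table))

base_table["row1"] = "changed"   # the base table keeps evolving
print(snapshot["row1"])          # → a (the snapshot is frozen in time)
```

Attempting to write through the snapshot raises an error, mirroring the read-only guarantee of the real feature.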

    Week of Oct 18 - Oct 22, 2021

    • Available now is our newest T2D VM family, based on 3rd Generation AMD EPYC processors. Learn more.
    • In case you missed it — top AI announcements from Google Cloud Next. Catch up on what’s new, see demos, and hear from our customers about how Google Cloud is making AI more accessible, more focused on business outcomes, and fast-tracking the time-to-value.
    • Too much to take in at Google Cloud Next 2021? No worries - here’s a breakdown of the biggest announcements at the 3-day event.
    • Check out the second revision of Architecture Framework, Google Cloud’s collection of canonical best practices.

    Week of Oct 4 - Oct 8, 2021

    • We’re excited to announce Google Cloud’s new goal of equipping more than 40 million people with Google Cloud skills. To help achieve this goal, we’re offering no-cost access to all our training content this month. Find out more here.
    • Support for language repositories in Artifact Registry is now generally available. Artifact Registry allows you to store all your language-specific artifacts in one place. Supported package types include Java, Node and Python. Additionally, support for Linux packages is in public preview. Learn more.
    • Want to know the latest on Google’s ML-powered intelligence service Active Assist, and how to learn more about it at Next ’21? Check out this blog.

    Week of Sept 27 - Oct 1, 2021

    • Announcing the launch of Speaker ID. In 2020, customer preference for voice calls increased by 10 percentage points (to 43%) and was by far the most preferred service channel. But most callers still need to pass through archaic authentication processes which slows down the time to resolution and burns through valuable agent time. Speaker ID, from Google Cloud, brings ML-based speaker identification directly to customers and contact center partners, allowing callers to authenticate over the phone, using their own voice. Learn more.
    • Your guide to all things AI & ML at Google Cloud Next. Google Cloud Next is coming October 12–14 and if you’re interested in AI & ML, we’ve got you covered. Tune in to hear about real use cases from companies like Twitter, Eli Lilly, Wayfair, and more. We’ll also share product news and hands-on AI learning opportunities. Learn more about AI at Next and register for free today!
    • It is now simple to use Terraform to configure Anthos features on your GKE clusters. Check out part two of this series which explores adding Policy Controller audits to our Config Sync managed cluster. Learn more.

    Week of Sept 20 - Sept 24, 2021

    • Announcing the webinar, Powering market data through cloud and AI/ML. We’re sponsoring a Coalition Greenwich webinar on September 23rd where we’ll discuss the findings of our upcoming study on how market data delivery and consumption is being transformed by cloud and AI. Moderated by Coalition Greenwich, the panel will feature Trey Berre from CME Group, Brad Levy from Symphony, and Ulku Rowe representing Google Cloud. Register here.
    • New research from Google Cloud reveals five innovation trends for market data. Together with Coalition Greenwich we surveyed exchanges, trading systems, data aggregators, data producers, asset managers, hedge funds, and investment banks to examine both the distribution and consumption of market data and trading infrastructure in the cloud. Learn more about our findings here.
    • If you are looking for a more automated way to manage quotas over a high number of projects, we are excited to introduce a Quota Monitoring Solution from Google Cloud Professional Services. This solution benefits customers who have many projects or organizations and are looking for an easy way to monitor the quota usage in a single dashboard and use default alerting capabilities across all quotas.

      Week of Sept 13 - Sept 17, 2021

      • New storage features help ensure data is never lost. We are announcing extensions to our popular Cloud Storage offering, and introducing two new services, Filestore Enterprise and Backup for Google Kubernetes Engine (GKE). Together, these new capabilities will make it easier for you to protect your data out-of-the-box, across a wide variety of applications and use cases. Read the full article.
      • API management powers sustainable resource management. Water, waste, and energy solutions company, Veolia, uses APIs and API Management platform Apigee to build apps and help their customers build their own apps, too. Learn from their digital and API-first approach here.
      • To support our expanding customer base in Canada, we’re excited to announce that the new Google Cloud Platform region in Toronto is now open. Toronto is the 28th Google Cloud region connected via our high-performance network, helping customers better serve their users and customers throughout the globe. In combination with Montreal, customers now benefit from improved business continuity planning with distributed, secure infrastructure needed to meet IT and business requirements for disaster recovery, while maintaining data sovereignty.
      • Cloud SQL now supports custom formatting controls for CSVs. When performing admin exports and imports, users can now select custom characters for field delimiters, quotes, escapes, and other characters. For more information, see our documentation.

      Week of Sept 6 - Sept 10, 2021

      • Hear how Lowe’s SRE was able to reduce their Mean Time to Recovery (MTTR) by over 80% after adopting Google’s Site Reliability Engineering practices and Google Cloud’s operations suite.

      Week of Aug 30 - Sept 3, 2021

      • A what’s new blog in the what’s new blog? Yes, you read that correctly. Google Cloud data engineers are always hard at work maintaining the hundreds of dataset pipelines that feed into our public datasets repository, but they’re also regularly bringing new ones into the mix. Check out our newest featured datasets and catch a few best practices in our living blog: What are the newest datasets in Google Cloud?
      • Migration success with Operational Health Reviews from Google Cloud’s Professional Service Organization - Learn how Google Cloud’s Professional Services Org is proactively and strategically guiding customers to operate effectively and efficiently in the Cloud, both during and after their migration process.
      • Learn how we simplified monitoring for Google Cloud VMware Engine and Google Cloud operations suite. Read more.

      Week of Aug 23 - Aug 27, 2021

      • Google Transfer Appliance announces a preview of online mode. Customers are increasingly collecting data that needs to be transferred to the cloud quickly. Transfer Appliances are being used to quickly offload data from sources (e.g. cameras, cars, sensors) and can now stream that data to a Cloud Storage bucket. As data is copied onto the appliance, online mode can be toggled to either send the data to Cloud Storage over the network or ship the appliance to Google offline. Read more.
      • Topic retention for Cloud Pub/Sub is now Generally Available. Topic retention is the most comprehensive and flexible way available to retain Pub/Sub messages for message replay. In addition to backing up all subscriptions connected to the topic, new subscriptions can now be initialized from a timestamp in the past. Learn more about the feature here.
      • Vertex Predictions now supports private endpoints for online prediction. Through VPC Peering, Private Endpoints provide increased security and lower latency when serving ML models. Read more.
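
Initializing a new subscription from a past timestamp, as the topic retention feature above enables, amounts to replaying every retained message published at or after that point. A toy sketch of the selection logic (the message shape and timestamps are hypothetical):

```python
from datetime import datetime, timezone

def replay_from(retained, seek_time):
    """Messages a subscription initialized at seek_time would receive."""
    return [m for m in retained if m["publish_time"] >= seek_time]

retained = [
    {"id": 1, "publish_time": datetime(2021, 8, 23, tzinfo=timezone.utc)},
    {"id": 2, "publish_time": datetime(2021, 8, 25, tzinfo=timezone.utc)},
    {"id": 3, "publish_time": datetime(2021, 8, 27, tzinfo=timezone.utc)},
]
seek = datetime(2021, 8, 24, tzinfo=timezone.utc)
print([m["id"] for m in replay_from(retained, seek)])  # → [2, 3]
```

With retention configured on the topic rather than per subscription, this replay window is shared by every subscription attached to the topic, including ones created later.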

      Week of Aug 16-Aug 20, 2021

      • Look for us to take security one step further by adding authorization features for service-to-service communications for gRPC proxyless services, as well as support for other deployment models where proxyless gRPC services run somewhere other than GKE, for example, Compute Engine. We hope you'll join us: check out the setup guide and give us feedback.
      • Cloud Run now supports VPC Service Controls. You can now protect your Cloud Run services against data exfiltration by using VPC Service Controls in conjunction with Cloud Run’s ingress and egress settings. Read more.
      • Read how retailers are leveraging Google Cloud VMware Engine to move their on-premises applications to the cloud, where they can achieve the scale, intelligence, and speed required to stay relevant and competitive. Read more.
      • A series of new features for BeyondCorp Enterprise, our zero trust offering. We now offer native support for client certificates for eight types of VPC-SC resources. We are also announcing general availability of the on-prem connector, which allows users to secure HTTP- or HTTPS-based on-premises applications outside of Google Cloud. Additionally, three new BeyondCorp attributes are available in Access Context Manager as part of a public preview. Customers can configure custom access policies based on time and date, credential strength, and/or Chrome browser attributes. Read more about these announcements here.
      • We are excited to announce that Google Cloud, working with its partners NAG and DDN, demonstrated the highest performing Lustre file system on the IO500 ranking of the fastest HPC storage systems — quite a feat considering Lustre is one of the most widely deployed HPC file systems in the world.  Read the full article.
      • The Storage Transfer Service for on-premises data API is now available in Preview. Now you can use RESTful APIs to automate your on-prem-to-cloud transfer workflows. Storage Transfer Service is a software service to transfer data over a network. The service provides built-in capabilities such as scheduling, bandwidth management, retries, and data integrity checks that simplify the data transfer workflow.
      • It is now simple to use Terraform to configure Anthos features on your GKE clusters. This is the first part of a three-part series describing how to use Terraform to enable Config Sync. For platform administrators, this natural IaC approach improves auditability and transparency and reduces the risk of misconfigurations or security gaps. Read more.
      • In this commissioned study, “Modernize With AIOps To Maximize Your Impact”, Forrester Consulting surveyed organizations worldwide to better understand how they’re approaching artificial intelligence for IT operations (AIOps) in their cloud environments, and what kind of benefits they’re seeing. Read more.
      • If your organization or development environment has strict security policies that don’t allow external IPs, it can be difficult to set up a connection between a private Cloud SQL instance and a private-IP VM. This article gives clear instructions on how to connect from a private Compute Engine VM to a private Cloud SQL instance using a private service connection and the mysqlsh command-line tool.

      Week of Aug 9-Aug 13, 2021

      • Compute Engine users have a new, updated set of VM-level “in-context” metrics, charts, and logs to correlate signals for common troubleshooting scenarios across CPU, Disk, Memory, Networking, and live Processes.  This brings the best of Google Cloud’s operations suite directly to the Compute Engine UI. Learn more.
      • The Pub/Sub to Splunk Dataflow template has been updated to address multiple enterprise customer asks, from improved compatibility with the Splunk Add-on for Google Cloud Platform, to more extensibility with user-defined functions (UDFs), to general pipeline reliability enhancements that tolerate failures like transient network issues when delivering data to Splunk. Read more to learn how to take advantage of these latest features.
      • Google Cloud and NVIDIA have teamed up to make VR/AR workloads easier and faster to create, and tetherless! Read more.
      • Register for the Google Cloud Startup Summit, September 9, 2021 at goo.gle/StartupSummit for a digital event filled with inspiration, learning, and discussion. This event will bring together our startup and VC community to discuss the latest trends and insights, headlined by a keynote by Astro Teller, Captain of Moonshots at X the moonshot factory. Additionally, learn from a variety of technical and business sessions to help take your startup to the next level.
      • Google Cloud and Harris Poll healthcare research reveals COVID-19 impacts on healthcare technology. Learn more.
      • Partial SSO is now available for public preview. If you use a third-party identity provider for single sign-on to Google services, Partial SSO lets you designate a subset of your users to use Google / Cloud Identity as your SAML SSO identity provider (short video and demo).

      Week of Aug 2-Aug 6, 2021

      • Gartner named Google Cloud a Leader in the 2021 Magic Quadrant for Cloud Infrastructure and Platform Services, formerly Infrastructure as a Service. Learn more.
      • Private Service Connect is now generally available. Private Service Connect lets you create private and secure connections to Google Cloud and third-party services with service endpoints in your VPCs. Read more.
      • 30 migration guides designed to help you identify the best ways to migrate, which include meeting common organizational goals like minimizing time and risk during your migration, identifying the most enterprise-grade infrastructure for your workloads, picking a cloud that aligns with your organization’s sustainability goals, and more. Read more.

      Week of Jul 26-Jul 30, 2021

      • This week we’re hosting our Retail & Consumer Goods Summit, a digital event dedicated to helping leading retailers and brands digitally transform their business. Read more about our consumer packaged goods strategy and a guide to key summit content for brands in this blog from Giusy Buonfantino, Google Cloud’s Vice President of CPG.

      • See how IKEA uses Recommendations AI to provide customers with more relevant product information. Read more.

      • Google Cloud launches a career program for people with autism designed to hire and support more talented people with autism in the rapidly growing cloud industry. Learn more.

      • Google Cloud follows new API stability tenets that work to minimize unexpected deprecations to our Enterprise APIs. Read more.

      Week of Jul 19-Jul 23, 2021

      • Register and join us for Google Cloud Next, October 12-14, 2021 at g.co/CloudNext for a fresh approach to digital transformation, as well as a few surprises. Next ‘21 will be a fully customizable digital adventure for a more personalized learning journey. Find the tools and training you need to succeed, from live, interactive Q&As and informative breakout sessions to educational demos and real-life applications of the latest tech from Google Cloud. Get ready to plug into your cloud community, get informed, and be inspired. Together we can tackle today’s greatest business challenges and start solving for what’s next.
      • "Application Innovation" takes a front-row seat this year–To stay ahead of rising customer expectations and the hybrid digital and in-person landscape, enterprises must know what application innovation means and how to deliver it with a small piece of technology that might surprise you. Learn more about the three pillars of app innovation here.
      • We announced Cloud IDS, our new network security offering, which is now available in preview. Cloud IDS delivers easy-to-use, cloud-native, managed, network-based threat detection. With Cloud IDS, customers can enjoy a Google Cloud-integrated experience, built with Palo Alto Networks’ industry-leading threat detection technologies to provide high levels of security efficacy. Learn more.
      • Key Visualizer for Cloud Spanner is now generally available. Key Visualizer is a new interactive monitoring tool that lets developers and administrators analyze usage patterns in Spanner. It reveals trends and outliers in key performance and resource metrics for databases of any size, helping to optimize queries and reduce infrastructure costs. See it in action.
      • The market for healthcare cloud is projected to grow 43%, driving the need for better tech infrastructure, digital transformation, and cloud tools. Learn how Google Cloud Partner Advantage partners help customers solve business challenges in healthcare.

      Week of Jul 12-Jul 16, 2021

      • Simplify VM migrations with Migrate for Compute Engine as a Service: a Google-managed cloud service that enables simple, frictionless, large-scale enterprise migrations of virtual machines to Google Compute Engine with minimal downtime and risk. API-driven and integrated into your Google Cloud console for ease of use, this service uses agentless replication to copy data without manual intervention and without VPN requirements. It also lets you launch non-disruptive validations of your VMs prior to cutover. Rapidly migrate a single application, or execute a sprint with a hundred systems using migration groups, with confidence. Read more here.
      • The Google Cloud region in Delhi NCR is now open for business, ready to host your workloads. Learn more and watch the region launch event here.
      • Introducing Quilkin: the open-source game server proxy. Developed in collaboration with Embark Studios, Quilkin is an open source UDP proxy, tailor-made for high performance real-time multiplayer games. Read more.
      • We’re making Google Glass on Meet available to a wider network of global customers. Learn more.
      • Transfer Appliance supports Google Managed Encryption Keys — We’re announcing support for Google Managed Encryption Keys with Transfer Appliance, in addition to the currently available Customer Managed Encryption Keys feature. Customers have asked for the Transfer Appliance service to create and manage encryption keys for transfer sessions, to improve usability while maintaining security. The Transfer Appliance service can now manage encryption keys for customers who do not wish to handle a key themselves. Learn more about Using Google Managed Encryption Keys.

      • UCLA builds a campus-wide API program– With Google Cloud's API management platform, Apigee, UCLA created a unified and strong API foundation that removes data friction that students, faculty, and administrators alike face. This foundation not only simplifies how various personas connect to data, but also encourages more innovations in the future. Learn their story.

      • An enhanced region picker makes it easy to choose a Google Cloud region with the lowest CO2 output. Learn more.
      • Amwell and Google Cloud explore five ways telehealth can help democratize access to healthcare. Read more.
      • Major League Baseball and Kaggle launch ML competition to learn about fan engagement. Batter up!
      • We’re rolling out general support of Brand Indicators for Message Identification (BIMI) in Gmail within Google Workspace. Learn more.

      • Learn how DeNA Sports Business created an operational status visualization system that helps determine whether live event attendees have correctly installed Japan’s coronavirus contact tracing app COCOA.

      • Google Cloud CAS provides a highly scalable and available private CA to address the unprecedented growth in certificates in the digital world. Read more about CAS.

      Week of Jul 5-Jul 9, 2021

      • Google Cloud and Call of Duty League launch ActivStat to bring fans, players, and commentators the power of competitive statistics in real-time. Read more.
      • Building applications is a heavy lift due to technical complexity, including the backend services used to manage and store data. Firestore changes this by having Google Cloud manage your backend complexity through a complete backend-as-a-service! Learn more.
      • Google Cloud’s new Native App Development skills challenge lets you earn badges that demonstrate your ability to create cloud-native apps. Read more and sign up.

      Week of Jun 28-Jul 2, 2021

      • Storage Transfer Service now offers preview support for Integration with AWS Security Token Service. Security conscious customers can now use Storage Transfer Service to perform transfers from AWS S3 without passing any security credentials. This release will alleviate the security burden associated with passing long-term AWS S3 credentials, which have to be rotated or explicitly revoked when they are no longer needed. Read more.
      • The most popular and surging Google Search terms are now available in BigQuery as a public dataset. View the Top 25 and Top 25 rising queries from Google Trends from the past 30 days, with 5 years of historical data across the 210 Designated Market Areas (DMAs) in the US. Learn more.
      • A new predictive autoscaling capability lets you add additional Compute Engine VMs in anticipation of forecasted demand. Predictive autoscaling is generally available across all Google Cloud regions. Read more or consult the documentation for more information on how to configure, simulate and monitor predictive autoscaling.
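      As a toy sketch of the idea only, not Google's actual forecasting model, a predictive autoscaler sizes a group of VMs from forecasted demand rather than current load; every name and number here is hypothetical:

      ```python
      import math

      def forecast_vm_count(cpu_history, target_utilization=0.6, vm_capacity=1.0):
          """Pick a VM count from a naive demand forecast.

          Forecast next-period demand as the mean of recent CPU demand
          (real predictive autoscaling uses far richer models), then
          provision enough VMs ahead of time so each stays at or below
          the target utilization."""
          predicted_load = sum(cpu_history) / len(cpu_history)
          return max(1, math.ceil(predicted_load / (vm_capacity * target_utilization)))

      print(forecast_vm_count([2.0, 4.0, 3.0]))  # mean load 3.0 at a 60% target -> 5 VMs
      ```

      Predictive autoscaling similarly provisions ahead of forecasted demand, so new VMs are ready before the load arrives rather than after it.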
      • Messages by Google is now the default messaging app for all AT&T customers using Android phones in the United States. Read more.
      • TPU v4 Pods will soon be available on Google Cloud, providing the most powerful publicly available computing platform for machine learning training. Learn more.
      • Cloud SQL for SQL Server has addressed multiple enterprise customer asks with the GA releases of both SQL Server 2019 and Active Directory integration, as well as the Preview release of Cross Region Replicas. This set of releases works in concert to let customers set up a more scalable and secure managed SQL Server environment to address their workloads’ needs. Read more.

      Week of Jun 21-Jun 25, 2021

      • Simplified return-to-office with no-code technology—We've just released a solution to your most common return-to-office headaches: make a no-code app customized to solve your business-specific challenges. Learn how to create an automated app where employees can see office room occupancy, check what desks are reserved or open, review disinfection schedules, and more in this blog tutorial.
      • New technical validation whitepaper for running ecommerce applications—Enterprise Strategy Group's analyst outlines the challenges of organizations running ecommerce applications and how Google Cloud helps to mitigate those challenges and handle changing demands with global infrastructure solutions. Download the whitepaper.
      • The full agenda for the Google for Games Developer Summit on July 12th-13th, 2021 is now available: a free digital event with announcements from teams including Stadia, Google Ads, AdMob, Android, Google Play, Firebase, Chrome, YouTube, and Google Cloud. Hear more about how Google Cloud technology creates opportunities for gaming companies to make lasting enhancements for players and creatives. Register at g.co/gamedevsummit.
      • BigQuery row-level security is now generally available, giving customers a way to control access to subsets of data in the same table for different groups of users. Row-level security (RLS) extends the principle of least privilege access and enables fine-grained access control policies in BigQuery tables. BigQuery currently supports access controls at the project-, dataset-, table- and column-level. Adding RLS to the portfolio of access controls now enables customers to filter and define access to specific rows in a table based on qualifying user conditions—providing much needed peace of mind for data professionals.
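      For illustration, a row access policy is declared with DDL; the table name, group, and filter column below are hypothetical:

      ```sql
      -- Only members of the named group can see US rows in this table.
      CREATE ROW ACCESS POLICY us_sales_filter
      ON mydataset.orders
      GRANT TO ("group:us-sales@example.com")
      FILTER USING (region = "US");
      ```

      Queries from other users simply see no rows matching the filter, with no change to the queries themselves.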
      • Transfer from Azure ADLS Gen 2: Storage Transfer Service offers Preview support for transferring data from Azure ADLS Gen 2 to Google Cloud Storage. Take advantage of a scalable, serverless service to handle data transfer. Read more.
      • reCAPTCHA V2 and V3 customers can now migrate site keys to reCAPTCHA Enterprise in under 10 minutes and without making any code changes. Watch our webinar to learn more.
      • Bot attacks are the biggest threat to your business that you probably haven’t addressed yet. Check out our Forbes article to see what you can do about it.

      Week of Jun 14-Jun 18, 2021

      • A new VM family for scale-out workloads—New AMD-based Tau VMs offer 56% higher absolute performance and 42% higher price-performance compared to general-purpose VMs from any of the leading public cloud vendors. Learn more.
      • New whitepaper helps customers plot their cloud migrations—Our new whitepaper distills the conversations we’ve had with CIOs, CTOs, and their technical staff into several frameworks that can help cut through the hype and the technical complexity to help devise the strategy that empowers both the business and IT. Read more or download the whitepaper.
      • Ubuntu Pro lands on Google Cloud—The general availability of Ubuntu Pro images on Google Cloud gives customers an improved Ubuntu experience, expanded security coverage, and integration with critical Google Cloud features. Read more.
      • Navigating hybrid work with a single, connected experience in Google Workspace—New additions to Google Workspace help businesses navigate the challenges of hybrid work, such as Companion Mode for Google Meet calls. Read more.
      • Arab Bank embraces Google Cloud technology—This Middle Eastern bank now offers innovative apps and services to their customers and employees with Apigee and Anthos. In fact, Arab Bank reports over 90% of their new-to-bank customers are using their mobile apps. Learn more.
      • Google Workspace for the Public Sector events—This June, learn about Google Workspace tips and tricks to help you get things done. Join us for one or more of our learning events tailored for government and higher education users. Learn more.

      Week of Jun 7-Jun 11, 2021

      • The top cloud capabilities industry leaders want for sustained innovation—Multicloud and hybrid cloud approaches, coupled with open-source technology adoption, enable IT teams to take full advantage of the best cloud has to offer. Our recent study with IDG shows just how much of a priority this has become for business leaders. Read more or download the report.
      • Announcing the Firmina subsea cable—Planned to run from the East Coast of the United States to Las Toninas, Argentina, with additional landings in Praia Grande, Brazil, and Punta del Este, Uruguay, Firmina will be the longest open subsea cable in the world capable of running entirely from a single power source at one end of the cable if its other power source(s) become temporarily unavailable—a resilience boost at a time when reliable connectivity is more important than ever. Read more.
      • New research reveals what’s needed for AI acceleration in manufacturing—According to our data, which polled more than 1,000 senior manufacturing executives across seven countries, 76% have turned to digital enablers and disruptive technologies, such as data and analytics, cloud, and artificial intelligence (AI), due to the pandemic. And 66% of manufacturers who use AI in their day-to-day operations report that their reliance on AI is increasing. Read more or download the report.
      • Cloud SQL offers even faster maintenance—Cloud SQL maintenance is zippier than ever. MySQL and PostgreSQL planned maintenance typically lasts less than 60 seconds and SQL Server maintenance typically lasts less than 120 seconds. You can learn more about maintenance here.
      • Simplifying Transfer Appliance configuration with Cloud Setup Application—We’re announcing the availability of the Transfer Appliance Cloud Setup Application, which uses the information you provide through simple prompts to configure your Google Cloud permissions, preferred Cloud Storage bucket, and Cloud KMS key for your transfer. Several manual, cloud-console-based steps are now simplified with a command line experience. Read more.
      • Google Cloud VMware Engine is now HIPAA compliant—As of April 1, 2021, Google Cloud VMware Engine is covered under the Google Cloud Business Associate Agreement (BAA), meaning it has achieved HIPAA compliance. Healthcare organizations can now migrate and run their HIPAA-compliant VMware workloads in a fully compatible VMware Cloud Verified stack running natively in Google Cloud with Google Cloud VMware Engine, without changes or re-architecture to tools, processes, or applications. Read more.
      • Introducing container-native Cloud DNS—Kubernetes networking almost always starts with a DNS request. DNS has broad impacts on your application and cluster performance, scalability, and resilience. That is why we are excited to announce the release of container-native Cloud DNS—the native integration of Cloud DNS with Google Kubernetes Engine (GKE) to provide in-cluster Service DNS resolution with Cloud DNS, our scalable and full-featured DNS service. Read more.
      • Welcoming the EU’s new Standard Contractual Clauses for cross-border data transfers—Learn how we’re incorporating the new Standard Contractual Clauses (SCCs) into our contracts to help protect our customers’ data and meet the requirements of European privacy legislation. Read more.
      • Lowe’s meets customer demand with Google SRE practices—Learn how Lowe’s has been able to increase the number of releases it can support by adopting Google’s Site Reliability Engineering (SRE) framework and leveraging its partnership with Google Cloud. Read more.
      • What’s next for SAP on Google Cloud at SAPPHIRE NOW and beyond—As SAP’s SAPPHIRE conference begins this week, we believe businesses have a more significant opportunity than ever to build for their next decade of growth and beyond. Learn more on how we’re working together with our customers, SAP, and our partners to support this transformation. Read more.
      • Support for Node.js, Python and Java repositories in Artifact Registry, now in Preview–With today’s announcement, you can not only use Artifact Registry to secure and distribute container images, but also manage and secure your other software artifacts. Read more.
      • Google named a Leader in The Forrester Wave: Streaming Analytics, Q2 2021 report–Learn about the criteria where Google Dataflow was rated 5 out of 5 and why this matters for our customers here.
      • Applied ML Summit this Thursday, June 10–Watch our keynote to learn about predictions for machine learning over the next decade. Engage with distinguished researchers, leading practitioners, and Kaggle Grandmasters during our live Ask Me Anything session. Take part in our modeling workshops to learn how you can iterate faster, and deploy and manage your models with confidence–no matter your level of formal computer science training. Learn how to develop and apply your professional skills, grow your abilities at the pace of innovation, and take your career to the next level. Register now.

      Week of May 31-Jun 4, 2021

      • Security Command Center now supports CIS 1.1 benchmarks and granular access control–Security Command Center (SCC) now supports CIS benchmarks for Google Cloud Platform Foundation v1.1, enabling you to monitor and address compliance violations against industry best practices in your Google Cloud environment. Additionally, SCC now supports fine-grained access control for administrators that allows you to easily adhere to the principle of least privilege—restricting access based on roles and responsibilities to reduce risk and enabling broader team engagement to address security. Read more.
      • Zero-trust managed security for services with Traffic Director–We created Traffic Director to bring to you a fully managed service mesh product that includes load balancing, traffic management and service discovery. And now, we’re happy to announce the availability of a fully-managed zero-trust security solution using Traffic Director with Google Kubernetes Engine (GKE) and Certificate Authority (CA) Service. Read more.
      • How one business modernized their data warehouse for customer success–PedidosYa migrated from their old data warehouse to Google Cloud's BigQuery. Now with BigQuery, the Latin American online food ordering company has reduced the total cost per query by 5x. Learn more.
      • Announcing new Cloud TPU VMs–New Cloud TPU VMs make it easier to use our industry-leading TPU hardware by providing direct access to TPU host machines, offering a new and improved user experience to develop and deploy TensorFlow, PyTorch, and JAX on Cloud TPUs. Read more.
      • Introducing logical replication and decoding for Cloud SQL for PostgreSQL–We’re announcing the public preview of logical replication and decoding for Cloud SQL for PostgreSQL. By releasing those capabilities and enabling change data capture (CDC) from Cloud SQL for PostgreSQL, we strengthen our commitment to building an open database platform that meets critical application requirements and integrates seamlessly with the PostgreSQL ecosystem. Read more.
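      As a sketch of what turning this on looks like (the instance name is hypothetical; `cloudsql.logical_decoding` is the documented flag):

      ```shell
      # Enable logical decoding on an existing Cloud SQL for PostgreSQL instance.
      gcloud sql instances patch my-postgres-instance \
          --database-flags=cloudsql.logical_decoding=on
      ```

      With the flag set, standard PostgreSQL logical replication and CDC tooling can consume changes from the instance.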
      • How 6 businesses are transforming with SAP on Google Cloud–Thousands of organizations globally rely on SAP for their most mission critical workloads. And for many Google Cloud customers, part of a broader digital transformation journey has included accelerating the migration of these essential SAP workloads to Google Cloud for greater agility, elasticity, and uptime. Read six of their stories.

      Week of May 24-May 28, 2021

      • Google Cloud for financial services: driving your transformation cloud journey–As we welcome the industry to our Financial Services Summit, we’re sharing more on how Google Cloud accelerates a financial organization’s digital transformation through app and infrastructure modernization, data democratization, people connections, and trusted transactions. Read more or watch the summit on demand.
      • Introducing Datashare solution for financial services–We announced the general availability of Datashare for financial services, a new Google Cloud solution that brings together the entire capital markets ecosystem—data publishers and data consumers—to exchange market data securely and easily. Read more.
      • Announcing Datastream in Preview–Datastream, a serverless change data capture (CDC) and replication service, allows enterprises to synchronize data across heterogeneous databases, storage systems, and applications reliably and with minimal latency to support real-time analytics, database replication, and event-driven architectures. Read more.
      • Introducing Dataplex: An intelligent data fabric for analytics at scale–Dataplex provides a way to centrally manage, monitor, and govern your data across data lakes, data warehouses and data marts, and make this data securely accessible to a variety of analytics and data science tools. Read more.
      • Announcing Dataflow Prime–Available in Preview in Q3 2021, Dataflow Prime is a new platform based on a serverless, no-ops, auto-tuning architecture built to bring unparalleled resource utilization and radical operational simplicity to big data processing. Dataflow Prime builds on Dataflow and brings new user benefits with innovations in resource utilization and distributed diagnostics. The new capabilities in Dataflow significantly reduce the time spent on infrastructure sizing and tuning tasks, as well as time spent diagnosing data freshness problems. Read more.
      • Secure and scalable sharing for data and analytics with Analytics Hub–With Analytics Hub, available in Preview in Q3, organizations get a rich data ecosystem by publishing and subscribing to analytics-ready datasets; control and monitoring over how their data is being used; a self-service way to access valuable and trusted data assets; and an easy way to monetize their data assets without the overhead of building and managing the infrastructure. Read more.
      • Cloud Spanner trims entry cost by 90%–Coming soon to Preview, granular instance sizing in Spanner lets organizations run workloads at as low as 1/10th the cost of regular instances, equating to approximately $65/month. Read more.
      • Cloud Bigtable lifts SLA and adds new security features for regulated industries–Bigtable instances with a multi-cluster routing policy across 3 or more regions are now covered by a 99.999% monthly uptime percentage under the new SLA. In addition, new Data Access audit logs can help determine whether sensitive customer information has been accessed in the event of a security incident, and if so, when, and by whom. Read more.
      • Build a no-code journaling app–In honor of Mental Health Awareness Month, Google Cloud's no-code application development platform, AppSheet, demonstrates how you can build a journaling app complete with titles, time stamps, mood entries, and more. Learn how with this blog and video here.
      • New features in Security Command Center—On May 24th, Security Command Center Premium launched the general availability of granular access controls at project- and folder-level and Center for Internet Security (CIS) 1.1 benchmarks for Google Cloud Platform Foundation. These new capabilities enable organizations to improve their security posture and efficiently manage risk for their Google Cloud environment. Learn more.
      • Simplified API operations with AI–Google Cloud's API management platform Apigee applies Google's industry leading ML and AI to your API metadata. Understand how it works with anomaly detection here.
      • This week: Data Cloud and Financial Services Summits–Our Google Cloud Summit series begins this week with the Data Cloud Summit on Wednesday May 26 (Global). At this half-day event, you’ll learn how leading companies like PayPal, Workday, Equifax, and many others are driving competitive differentiation using Google Cloud technologies to build their data clouds and transform data into value that drives innovation. The following day, Thursday May 27 (Global & EMEA) at the Financial Services Summit, discover how Google Cloud is helping financial institutions such as PayPal, Global Payments, HSBC, Credit Suisse, AXA Switzerland and more unlock new possibilities and accelerate business through innovation. Read more and explore the entire summit series.
      • Announcing the Google for Games Developer Summit 2021 on July 12th-13th–With a surge of new gamers and an increase in time spent playing games in the last year, it’s more important than ever for game developers to delight and engage players. To help developers with this opportunity, the games teams at Google are back to announce the return of the Google for Games Developer Summit 2021 on July 12th-13th. Hear from experts across Google about new game solutions they’re building to make it easier for you to continue creating great games, connecting with players and scaling your business. Registration is free and open to all game developers. Register for the free online event at g.co/gamedevsummit to get more details in the coming weeks. We can’t wait to share our latest innovations with the developer community. Learn more.

      Week of May 17-May 21, 2021

      • Best practices to protect your organization against ransomware threats–For more than 20 years Google has been operating securely in the cloud, using our modern technology stack to provide a more defensible environment that we can protect at scale. While the threat of ransomware isn’t new, our responsibility to help protect you from existing or emerging threats never changes. In our recent blog post, we shared guidance on how organizations can increase their resilience to ransomware and how some of our Cloud products and services can help. Read more.

      • Forrester names Google Cloud a Leader in Unstructured Data Security Platforms–Forrester Research has named Google Cloud a Leader in The Forrester Wave: Unstructured Data Security Platforms, Q2 2021 report, and rated Google Cloud highest in the current offering category among the providers evaluated. Read more or download the report.
      • Introducing Vertex AI: One platform, every ML tool you need–Vertex AI is a managed machine learning (ML) platform that allows companies to accelerate the deployment and maintenance of artificial intelligence (AI) models. Read more.
      • Transforming collaboration in Google Workspace–We’re launching smart canvas, a new product experience that delivers the next evolution of collaboration for Google Workspace. Between now and the end of the year, we’re rolling out innovations that make it easier for people to stay connected, focus their time and attention, and transform their ideas into impact. Read more.
      • Developing next-generation geothermal power–At I/O this week, we announced a first-of-its-kind, next-generation geothermal project with clean-energy startup Fervo that will soon begin adding carbon-free energy to the electric grid that serves our data centers and infrastructure throughout Nevada, including our Cloud region in Las Vegas. Read more.
      • Contributing to an environment of trust and transparency in Europe–Google Cloud was one of the first cloud providers to support and adopt the EU GDPR Cloud Code of Conduct (CoC). The CoC is a mechanism for cloud providers to demonstrate how they offer sufficient guarantees to implement appropriate technical and organizational measures as data processors under the GDPR. This week, the Belgian Data Protection Authority, based on a positive opinion by the European Data Protection Board (EDPB), approved the CoC, a product of years of constructive collaboration between the cloud computing community, the European Commission, and European data protection authorities. We are proud to say that Google Cloud Platform and Google Workspace already adhere to these provisions. Learn more.
      • Announcing Google Cloud datasets solutions–We're adding commercial, synthetic, and first-party data to our Google Cloud Public Datasets Program to help organizations increase the value of their analytics and AI initiatives, and we're making available an open source reference architecture for a more streamlined data onboarding process to the program. Read more.
      • Introducing custom samples in Cloud Code–With new custom samples in Cloud Code, developers can quickly access your enterprise’s best code samples via a versioned Git repository directly from their IDEs. Read more.
      • Retention settings for Cloud SQL–Cloud SQL now allows you to configure backup retention settings to protect against data loss. You can retain between 1 and 365 days’ worth of automated backups and between 1 and 7 days’ worth of transaction logs for point-in-time recovery. See the details here.
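      The documented windows (1–365 retained automated backups, 1–7 days of transaction logs) can be sanity-checked before issuing an update. A minimal sketch; the `validate_retention` helper and its parameter names are our own, not part of the Cloud SQL API:

      ```python
      def validate_retention(retained_backups: int, transaction_log_days: int) -> None:
          """Check a proposed config against Cloud SQL's documented retention windows:
          1-365 automated backups, 1-7 days of transaction logs for point-in-time recovery."""
          if not 1 <= retained_backups <= 365:
              raise ValueError("retained backups must be between 1 and 365")
          if not 1 <= transaction_log_days <= 7:
              raise ValueError("transaction log retention must be between 1 and 7 days")

      # A 30-day backup window with the full 7 days of logs passes validation.
      validate_retention(retained_backups=30, transaction_log_days=7)
      ```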
      • Cloud developer’s guide to Google I/O 2021–Google I/O may look a little different this year, but don’t worry, you’ll still get the same first-hand look at the newest launches and projects coming from Google. Best of all, it’s free and available to all (virtually) on May 18-20. Read more.

      Week of May 10-May 14, 2021

      • APIs and Apigee power modern day due diligence–With APIs and Google Cloud's Apigee, business due diligence company DueDil revolutionized the way they harness and share their Big Information Graph (B.I.G.) with partners and customers. Get the full story.
      • Cloud CISO Perspectives: May 2021–It’s been a busy month here at Google Cloud since our inaugural CISO perspectives blog post in April. Here, Phil Venables, VP and CISO of Google Cloud, recaps our cloud security and industry highlights, offers a sneak peek of what’s ahead from Google at RSA, and more. Read more.
      • 4 new features to secure your Cloud Run services–We announced several new ways to secure Cloud Run environments to make developing and deploying containerized applications easier for developers. Read more.
      • Maximize your Cloud Run investments with new committed use discounts–We’re introducing self-service spend-based committed use discounts for Cloud Run, which let you commit for a year to spending a certain amount on Cloud Run and benefiting from a 17% discount on the amount you committed. Read more.
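      As a back-of-the-envelope illustration of the 17% spend-based discount (billing mechanics simplified; `committed_cost` is our own helper, not a Google Cloud API):

      ```python
      def committed_cost(monthly_commit_usd: float, discount: float = 0.17) -> float:
          """Effective monthly bill for a spend-based committed use discount,
          applied to the committed amount (simplified illustration)."""
          return round(monthly_commit_usd * (1 - discount), 2)

      # Committing to $1,000/month of Cloud Run spend is billed at $830/month.
      print(committed_cost(1000.0))  # 830.0
      ```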
      • Google Cloud Armor Managed Protection Plus is now generally available–Cloud Armor, our Distributed Denial of Service (DDoS) protection and Web-Application Firewall (WAF) service on Google Cloud, leverages the same infrastructure, network, and technology that has protected Google’s internet-facing properties from some of the largest attacks ever reported. These same tools protect customers’ infrastructure from DDoS attacks, which are increasing in both magnitude and complexity every year. Deployed at the very edge of our network, Cloud Armor absorbs malicious network- and protocol-based volumetric attacks, while mitigating the OWASP Top 10 risks and maintaining the availability of protected services. Read more.
      • Announcing Document Translation for Translation API Advanced in preview–Translation is critical to many developers and localization providers, whether you’re releasing a document, a piece of software, training materials or a website in multiple languages. With Document Translation, now you can directly translate documents in 100+ languages and formats such as Docx, PPTx, XLSx, and PDF while preserving document formatting. Read more.
      • Introducing BeyondCorp Enterprise protected profiles–Protected profiles enable users to securely access corporate resources from an unmanaged device with the same threat and data protections available in BeyondCorp Enterprise–all from the Chrome Browser. Read more.
      • How reCAPTCHA Enterprise protects unemployment and COVID-19 vaccination portals–With so many people visiting government websites to learn more about the COVID-19 vaccine, make vaccine appointments, or file for unemployment, these web pages have become prime targets for bot attacks and other abusive activities. But reCAPTCHA Enterprise has helped state governments protect COVID-19 vaccine registration portals and unemployment claims portals from abusive activities. Learn more.
      • Day one with Anthos? Here are 6 ideas for how to get started–Once you have your new application platform in place, there are some things you can do to immediately get value and gain momentum. Here are six things you can do to get you started. Read more.
      • The era of the transformation cloud is here–Google Cloud’s president Rob Enslin shares how the era of the transformation cloud has seen organizations move beyond data centers to change not only where their business is done but, more importantly, how it is done. Read more.

      Week of May 3-May 7, 2021

      • Transforming hard-disk drive maintenance with predictive ML–In collaboration with Seagate, we developed a machine learning system that can forecast the probability of a recurring failing disk—a disk that fails or has experienced three or more problems in 30 days. Learn how we did it.
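      The labeling rule quoted above is simple enough to state in code. A sketch of that rule only (the drive names and fields are illustrative, not from the Seagate system):

      ```python
      def recurring_failure(failed: bool, problems_last_30d: int) -> bool:
          """Positive label per the post's definition: the disk failed outright,
          or logged three or more problems within a 30-day window."""
          return failed or problems_last_30d >= 3

      # (name, failed, problem count in the last 30 days)
      fleet = [("disk-a", False, 0), ("disk-b", False, 3), ("disk-c", True, 1)]
      flagged = [name for name, failed, n in fleet if recurring_failure(failed, n)]
      print(flagged)  # ['disk-b', 'disk-c']
      ```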
      • Agent Assist for Chat is now in public preview–Agent Assist provides your human agents with continuous support during their calls, and now chats, by identifying the customers’ intent and providing them with real-time recommendations such as articles and FAQs as well as responses to customer messages to more effectively resolve the conversation. Read more.
      • New Google Cloud, AWS, and Azure product map–Our updated product map helps you understand similar offerings from Google Cloud, AWS, and Azure, and you can easily filter the list by product name or other common keywords. Read more or view the map.
      • Join our Google Cloud Security Talks on May 12th–We’ll share expert insights into how we’re working to be your most trusted cloud. Find the list of topics we’ll cover here.
      • Databricks is now GA on Google Cloud–Deploy or migrate Databricks Lakehouse to Google Cloud to combine the benefits of an open data cloud platform with greater analytics flexibility, unified infrastructure management, and optimized performance. Read more.
      • HPC VM image is now GA–The CentOS-based HPC VM image makes it quick and easy to create HPC-ready VMs on Google Cloud that are pre-tuned for optimal performance. Check out our documentation and quickstart guide to start creating instances using the HPC VM image today.
      • Take the 2021 State of DevOps survey–Help us shape the future of DevOps and make your voice heard by completing the 2021 State of DevOps survey before June 11, 2021. Read more or take the survey.
      • OpenTelemetry Trace 1.0 is now available–OpenTelemetry has reached a key milestone: the OpenTelemetry Tracing Specification has reached version 1.0. API and SDK release candidates are available for Java, Erlang, Python, Go, Node.js, and .NET. Additional languages will follow over the next few weeks. Read more.
      • New blueprint helps secure confidential data in AI Platform Notebooks–We’re adding to our portfolio of blueprints with the publication of our Protecting confidential data in AI Platform Notebooks blueprint guide and deployable blueprint, which can help you apply data governance and security policies that protect your AI Platform Notebooks containing confidential data. Read more.
      • The Liquibase Cloud Spanner extension is now GA–Liquibase, an open-source library that works with a wide variety of databases, can be used for tracking, managing, and automating database schema changes. By providing the ability to integrate databases into your CI/CD process, Liquibase helps you more fully adopt DevOps practices. The Liquibase Cloud Spanner extension allows developers to use Liquibase's open-source database library to manage and automate schema changes in Cloud Spanner. Read more.
      • Cloud computing 101: Frequently asked questions–There are a number of terms and concepts in cloud computing, and not everyone is familiar with all of them. To help, we’ve put together a list of common questions, and the meanings of a few of those acronyms. Read more.

      Week of Apr 26-Apr 30, 2021

      • Announcing the GKE Gateway controller, in Preview–GKE Gateway controller, Google Cloud’s implementation of the Gateway API, manages internal and external HTTP/S load balancing for a GKE cluster or a fleet of GKE clusters and provides multi-tenant sharing of load balancer infrastructure with centralized admin policy and control. Read more.
      • See Network Performance for Google Cloud in Performance Dashboard–The Google Cloud performance view, part of the Network Intelligence Center, provides packet loss and latency metrics for traffic on Google Cloud. It allows users to do informed planning of their deployment architecture, as well as determine in real time the answer to the most common troubleshooting question: "Is it Google or is it me?" The Google Cloud performance view is now open for all Google Cloud customers as a public preview. Check it out.
      • Optimizing data in Google Sheets allows users to create no-code apps–Format columns and tables in Google Sheets to best position your data for transformation into a fully customized, successful app–no coding necessary. Read our four best Google Sheets tips.
      • Automation bots with AppSheet Automation–AppSheet recently released AppSheet Automation, infusing Google AI capabilities into AppSheet's trusted no-code app development platform. Learn step by step how to build your first automation bot on AppSheet here.
      • Google Cloud announces a new region in Israel–Our new region in Israel will make it easier for customers to serve their own users faster, more reliably and securely. Read more.
      • New multi-instance NVIDIA GPUs on GKE–We’re launching support for multi-instance GPUs in GKE (currently in Preview), which will help you drive better value from your GPU investments. Read more.
      • Partnering with NSF to advance networking innovation–We announced our partnership with the U.S. National Science Foundation (NSF), joining other industry partners and federal agencies, as part of a combined $40 million investment in academic research for Resilient and Intelligent Next-Generation (NextG) Systems, or RINGS. Read more.
      • Creating a policy contract with Configuration as Data–Configuration as Data is an emerging cloud infrastructure management paradigm that allows developers to declare the desired state of their applications and infrastructure, without specifying the precise actions or steps for how to achieve it. However, declaring a configuration is only half the battle: you also want policy that defines how a configuration is to be used. This post shows you how.
      • Google Cloud products deliver real-time data solutions–Seven-Eleven Japan built Seven Central, its new platform for digital transformation, on Google Cloud. Powered by BigQuery, Cloud Spanner, and Apigee API management, Seven Central presents easy to understand data, ultimately allowing for quickly informed decisions. Read their story here.

      Week of Apr 19-Apr 23, 2021

      • Extreme PD is now GA–On April 20th, Google Cloud’s Persistent Disk launched general availability of Extreme PD, a high performance block storage volume with provisioned IOPS and up to 2.2 GB/s of throughput. Learn more.

      • Research: How data analytics and intelligence tools will play a key role post-COVID-19–A recent Google-commissioned study by IDG highlighted the role of data analytics and intelligent solutions when it comes to helping businesses separate from their competition. The survey of 2,000 IT leaders across the globe reinforced the notion that the ability to derive insights from data will go a long way towards determining which companies win in this new era. Learn more or download the study.

      • Introducing PHP on Cloud Functions–We’re bringing support for PHP, a popular general-purpose programming language, to Cloud Functions. With the Functions Framework for PHP, you can write idiomatic PHP functions to build business-critical applications and integration layers. And with Cloud Functions for PHP, now available in Preview, you can deploy functions in a fully managed PHP 7.4 environment, complete with access to resources in a private VPC network. Learn more.

      • Delivering our 2020 CCAG pooled audit–As our customers increased their use of cloud services to meet the demands of teleworking and aid in COVID-19 recovery, we’ve worked hard to meet our commitment to being the industry’s most trusted cloud, despite the global pandemic. We’re proud to announce that Google Cloud completed an annual pooled audit with the CCAG in a completely remote setting, and were the only cloud service provider to do so in 2020. Learn more.

      • Anthos 1.7 now available–We recently released Anthos 1.7, our run-anywhere Kubernetes platform that’s connected to Google Cloud, delivering an array of capabilities that make multicloud more accessible and sustainable. Learn more.

      • New Redis Enterprise for Anthos and GKE–We’re making Redis Enterprise for Anthos and Google Kubernetes Engine (GKE) available in the Google Cloud Marketplace in private preview. Learn more.

      • Updates to Google Meet–We introduced a refreshed user interface (UI), enhanced reliability features powered by the latest Google AI, and tools that make meetings more engaging—even fun—for everyone involved. Learn more.

      • DocAI solutions now generally available–The Document (Doc) AI platform, Lending DocAI, and Procurement DocAI, built on decades of AI innovation at Google, bring powerful and useful solutions across lending, insurance, government and other industries. Learn more.

      • Four consecutive years of 100% renewable energy–In 2020, Google again matched 100 percent of its global electricity use with purchases of renewable energy. All told, we’ve signed agreements to buy power from more than 50 renewable energy projects, with a combined capacity of 5.5 gigawatts–about the same as a million solar rooftops. Learn more.

      • Announcing the Google Cloud region picker–The Google Cloud region picker lets you assess key inputs like price, latency to your end users, and carbon footprint to help you choose which Google Cloud region to run on. Learn more.

      • Google Cloud launches new security solution WAAP–WebApp and API Protection (WAAP) combines Google Cloud Armor, Apigee, and reCAPTCHA Enterprise to deliver improved threat protection, consolidated visibility, and greater operational efficiencies across clouds and on-premises environments. Learn more about WAAP here.
      • New in no-code–As discussed in our recent article, no-code hackathons are trending among innovative organizations. Since then, we've outlined how you can host one yourself specifically designed for your unique business innovation outcomes. Learn how here.
      • Google Cloud Referral Program now available—Now you can share the power of Google Cloud and earn product credit for every new paying customer you refer. Once you join the program, you’ll get a unique referral link that you can share with friends, clients, or others. Whenever someone signs up with your link, they’ll get a $350 product credit—that’s $50 more than the standard trial credit. When they become a paying customer, we’ll reward you with a $100 product credit in your Google Cloud account. Available in the United States, Canada, Brazil, and Japan. Apply for the Google Cloud Referral Program.

      Week of Apr 12-Apr 16, 2021

      • Announcing the Data Cloud Summit, May 26, 2021–At this half-day event, you’ll learn how leading companies like PayPal, Workday, Equifax, Zebra Technologies, Commonwealth Care Alliance and many others are driving competitive differentiation using Google Cloud technologies to build their data clouds and transform data into value that drives innovation. Learn more and register at no cost.
      • Announcing the Financial Services Summit, May 27, 2021–In this two-hour event, you’ll learn how Google Cloud is helping financial institutions including PayPal, Global Payments, HSBC, Credit Suisse, and more unlock new possibilities and accelerate business through innovation and better customer experiences. Learn more and register for free: Global & EMEA.
      • How Google Cloud is enabling vaccine equity–In our latest update, we share more on how we’re working with US state governments to help produce equitable vaccination strategies at scale. Learn more.
      • The new Google Cloud region in Warsaw is open–The Google Cloud region in Warsaw is now ready for business, opening doors for organizations in Central and Eastern Europe. Learn more.
      • AppSheet Automation is now GA–Google Cloud’s AppSheet launches general availability of AppSheet Automation, a unified development experience for citizen and professional developers alike to build custom applications with automated processes, all without coding. Learn how companies and employees are reclaiming their time and talent with AppSheet Automation here.
      • Introducing SAP Integration with Cloud Data Fusion–Google Cloud native data integration platform Cloud Data Fusion now offers the capability to seamlessly get data out of SAP Business Suite, SAP ERP and S/4HANA. Learn more.

      Week of Apr 5-Apr 9, 2021

      • New Certificate Authority Service (CAS) whitepaper–“How to deploy a secure and reliable public key infrastructure with Google Cloud Certificate Authority Service” (written by Mark Cooper of PKI Solutions and Anoosh Saboori of Google Cloud) covers security and architectural recommendations for the use of the Google Cloud CAS by organizations, and describes critical concepts for securing and deploying a PKI based on CAS. Learn more or read the whitepaper.
      • Active Assist’s new feature, predictive autoscaling, helps improve response times for your applications–When you enable predictive autoscaling, Compute Engine forecasts future load based on your Managed Instance Group’s (MIG) history and scales it out in advance of predicted load, so that new instances are ready to serve when the load arrives. Without predictive autoscaling, an autoscaler can only scale a group reactively, based on observed changes in load in real time. With predictive autoscaling enabled, the autoscaler works with real-time data as well as with historical data to cover both the current and forecasted load. That makes predictive autoscaling ideal for those apps with long initialization times and whose workloads vary predictably with daily or weekly cycles. For more information, see How predictive autoscaling works or check if predictive autoscaling is suitable for your workload, and to learn more about other intelligent features, check out Active Assist.
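      A toy illustration of the idea, not Compute Engine's actual forecasting model: predict the next sample of a periodic load series from the same phase in earlier cycles, then provision capacity ahead of it. All names and numbers below are invented for the sketch:

      ```python
      import math

      def forecast_next(history: list[float], period: int) -> float:
          """Forecast the next sample of a periodic load series by averaging
          the samples at the same phase in earlier cycles."""
          i = len(history)
          same_phase = [history[j] for j in range(i - period, -1, -period)]
          return sum(same_phase) / len(same_phase)

      def target_instances(forecast: float, per_instance_capacity: float) -> int:
          """Instances to have ready before the predicted load arrives."""
          return math.ceil(forecast / per_instance_capacity)

      # Two "days" of three samples each; the upcoming phase averaged 11 in the past.
      history = [10, 50, 20, 12, 52, 22]
      print(forecast_next(history, period=3))                  # 11.0
      print(target_instances(11.0, per_instance_capacity=5))   # 3
      ```

      A reactive autoscaler would only respond after the load arrived; scaling against the forecast hides instance startup time, which is why the feature suits apps with long initialization and cyclical traffic.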
      • Introducing Dataprep BigQuery pushdown–BigQuery pushdown gives you the flexibility to run jobs using either BigQuery or Dataflow. If you select BigQuery, then Dataprep can automatically determine if data pipelines can be partially or fully translated in a BigQuery SQL statement. Any portions of the pipeline that cannot be run in BigQuery are executed in Dataflow. Utilizing the power of BigQuery results in highly efficient data transformations, especially for manipulations such as filters, joins, unions, and aggregations. This leads to better performance, optimized costs, and increased security with IAM and OAuth support. Learn more.
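      Conceptually, the partial-pushdown decision can be pictured as splitting an ordered pipeline into a SQL-translatable prefix and a remainder for Dataflow. This is a simplification of our own (Dataprep's real planner is more sophisticated, and the step names are hypothetical):

      ```python
      # Transform types translatable to a BigQuery SQL statement (illustrative subset).
      SQL_TRANSLATABLE = {"filter", "join", "union", "aggregate"}

      def split_pipeline(steps: list[str]) -> tuple[list[str], list[str]]:
          """Split an ordered pipeline into a leading prefix that could run as
          BigQuery SQL and the remainder, which would execute in Dataflow."""
          prefix = []
          for step in steps:
              if step not in SQL_TRANSLATABLE:
                  break
              prefix.append(step)
          return prefix, steps[len(prefix):]

      print(split_pipeline(["filter", "join", "custom_udf", "aggregate"]))
      # (['filter', 'join'], ['custom_udf', 'aggregate'])
      ```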
      • Announcing the Google Cloud Retail & Consumer Goods Summit–The Google Cloud Retail & Consumer Goods Summit brings together technology and business insights, the key ingredients for any transformation. Whether you're responsible for IT, data analytics, supply chains, or marketing, please join! Building connections and sharing perspectives cross-functionally is important to reimagining yourself, your organization, or the world. Learn more or register for free.
      • New IDC whitepaper assesses multicloud as a risk mitigation strategy–To better understand the benefits and challenges associated with a multicloud approach, we supported IDC’s new whitepaper that investigates how multicloud can help regulated organizations mitigate the risks of using a single cloud vendor. The whitepaper looks at different approaches to multi-vendor and hybrid clouds taken by European organizations and how these strategies can help organizations address concentration risk and vendor-lock in, improve their compliance posture, and demonstrate an exit strategy. Learn more or download the paper.
      • Introducing request priorities for Cloud Spanner APIs–You can now specify request priorities for some Cloud Spanner APIs. By assigning a HIGH, MEDIUM, or LOW priority to a specific request, you can now convey the relative importance of workloads, to better align resource usage with performance objectives. Learn more.
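      Spanner treats the priority as a hint rather than a strict queue, but the intended semantics resemble highest-priority-first scheduling with FIFO order within a level. A self-contained sketch of that ordering (request names are invented):

      ```python
      import heapq

      PRIORITY_RANK = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}

      def drain(requests: list[tuple[str, str]]) -> list[str]:
          """Serve queued (name, priority) requests highest-priority first,
          preserving arrival order within each priority level."""
          heap = [(PRIORITY_RANK[p], i, name) for i, (name, p) in enumerate(requests)]
          heapq.heapify(heap)
          return [heapq.heappop(heap)[2] for _ in range(len(heap))]

      print(drain([("nightly-export", "LOW"), ("checkout", "HIGH"), ("report", "MEDIUM")]))
      # ['checkout', 'report', 'nightly-export']
      ```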
      • How we’re working with governments on climate goals–Google Sustainability Officer Kate Brandt shares more on how we’re partnering with governments around the world to provide our technology and insights to drive progress in sustainability efforts. Learn more.

      Week of Mar 29-Apr 2, 2021

      • Why Google Cloud is the ideal platform for Block.one and other DLT companies–Late last year, Google Cloud joined the EOS community, a leading open-source platform for blockchain innovation and performance, and is taking steps to support the EOS Public Blockchain by becoming a block producer (BP). At the time, we outlined how our planned participation underscores the importance of blockchain to the future of business, government, and society. We're sharing more on why Google Cloud is uniquely positioned to be an excellent partner for Block.one and other distributed ledger technology (DLT) companies. Learn more.
      • New whitepaper: Scaling certificate management with Certificate Authority Service–As Google Cloud’s Certificate Authority Service (CAS) approaches general availability, we want to help customers understand the service better. Customers have asked us how CAS fits into our larger security story and how CAS works for various use cases. Our new white paper answers these questions and more. Learn more and download the paper.
      • Build a consistent approach for API consumers–Learn the differences between REST and GraphQL, as well as how to apply REST-based practices to GraphQL. No matter the approach, discover how to manage and treat both options as API products here.

      • Apigee X makes it simple to apply Cloud CDN to APIs–With Apigee X and Cloud CDN, organizations can expand their API programs' global reach. Learn how to deploy APIs across 24 regions and 73 zones here.

      • Enabling data migration with Transfer Appliances in APAC—We’re announcing the general availability of Transfer Appliances TA40/TA300 in Singapore. Customers are looking for fast, secure, and easy-to-use options to migrate their workloads to Google Cloud, and we are addressing their needs with Transfer Appliances globally in the US, EU and APAC. Learn more about Transfer Appliances TA40 and TA300.

      • Windows Authentication is now supported on Cloud SQL for SQL Server in public preview—We’ve launched seamless integration with Google Cloud’s Managed Service for Microsoft Active Directory (AD). This capability is a critical requirement to simplify identity management and streamline the migration of existing SQL Server workloads that rely on AD for access control. Learn more or get started.

      • Using Cloud AI to whip up new treats with Mars Maltesers—Maltesers, a popular British candy made by Mars, teamed up with our own AI baker and ML engineer extraordinaire, Sara Robinson, to create a brand new dessert recipe with Google Cloud AI. Find out what happened (recipe included).

      • Simplifying data lake management with Dataproc Metastore, now GA—Dataproc Metastore, a fully managed, serverless technical metadata repository based on the Apache Hive metastore, is now generally available. Enterprises building and migrating open source data lakes to Google Cloud now have a central and persistent metastore for their open source data analytics frameworks. Learn more.

      • Introducing the Echo subsea cable—We announced our investment in Echo, the first-ever cable to directly connect the U.S. to Singapore with direct fiber pairs over an express route. Echo will run from Eureka, California to Singapore, with a stop-over in Guam, and plans to also land in Indonesia. Additional landings are possible in the future. Learn more.

      Week of Mar 22-Mar 26, 2021

      • 10 new videos bring Google Cloud to life—The Google Cloud Tech YouTube channel’s latest video series explains cloud tools for technical practitioners in about 5 minutes each. Learn more.
      • BigQuery named a Leader in the 2021 Forrester Wave: Cloud Data Warehouse, Q1 2021 report—Forrester gave BigQuery a score of 5 out of 5 across 19 different criteria. Learn more in our blog post, or download the report.
      • Charting the future of custom compute at Google—To meet users’ performance needs at low power, we’re doubling down on custom chips that use System on a Chip (SoC) designs. Learn more.
      • Introducing Network Connectivity Center—We announced Network Connectivity Center, which provides a single management experience to easily create, connect, and manage heterogeneous on-prem and cloud networks leveraging Google’s global infrastructure. Network Connectivity Center serves as a vantage point to seamlessly connect VPNs, partner and dedicated interconnects, as well as third-party routers and Software-Defined WANs, helping you optimize connectivity, reduce operational burden and lower costs—wherever your applications or users may be. Learn more.
      • Making it easier to get Compute Engine resources for batch processing—We announced a new method of obtaining Compute Engine instances for batch processing that accounts for availability of resources in zones of a region. Now available in preview for regional managed instance groups, you can do this simply by specifying the ANY value in the API. Learn more.
      • Next-gen virtual automotive showrooms are here, thanks to Google Cloud, Unreal Engine, and NVIDIA—We teamed up with Unreal Engine, the open and advanced real-time 3D creation game engine, and NVIDIA, inventor of the GPU, to launch new virtual showroom experiences for automakers. Taking advantage of the NVIDIA RTX platform on Google Cloud, these showrooms provide interactive 3D experiences, photorealistic materials and environments, and up to 4K cloud streaming on mobile and connected devices. Today, in collaboration with MHP, the Porsche IT consulting firm, and MONKEYWAY, a real-time 3D streaming solution provider, you can see our first virtual showroom, the Pagani Immersive Experience Platform. Learn more.
      • Troubleshoot network connectivity with Dynamic Verification (public preview)—You can now check packet loss rate and one-way network latency between two VMs on GCP. This capability is an addition to existing Network Intelligence Center Connectivity Tests which verify reachability by analyzing network configuration in your VPCs. See more in our documentation.
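      The two metrics the feature reports are simple to define. A sketch of the arithmetic only (probe counts and timestamps are illustrative; this is not how Network Intelligence Center collects them):

      ```python
      def packet_loss_rate(sent: int, received: int) -> float:
          """Fraction of probe packets lost between the two VMs."""
          if sent <= 0:
              raise ValueError("no packets sent")
          return (sent - received) / sent

      def one_way_latency_ms(send_ts_ms: list[float], recv_ts_ms: list[float]) -> list[float]:
          """Per-packet one-way latency, assuming clock-synchronized timestamps."""
          return [r - s for s, r in zip(send_ts_ms, recv_ts_ms)]

      print(packet_loss_rate(1000, 990))            # 0.01
      print(one_way_latency_ms([0.0, 10.0], [1.5, 11.8]))
      ```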
      • Helping U.S. states get the COVID-19 vaccine to more people—In February, we announced our Intelligent Vaccine Impact solution (IVIs) to help communities rise to the challenge of getting vaccines to more people quickly and effectively. Many states have deployed IVIs, and have found it able to meet demand and easily integrate with their existing technology infrastructures. Google Cloud is proud to partner with a number of states across the U.S., including Arizona, the Commonwealth of Massachusetts, North Carolina, Oregon, and the Commonwealth of Virginia to support vaccination efforts at scale. Learn more.

      Week of Mar 15-Mar 19, 2021

      • A2 VMs now GA: The largest GPU cloud instances with NVIDIA A100 GPUs—We’re announcing the general availability of A2 VMs based on the NVIDIA Ampere A100 Tensor Core GPUs in Compute Engine. This means customers around the world can now run their NVIDIA CUDA-enabled machine learning (ML) and high performance computing (HPC) scale-out and scale-up workloads more efficiently and at a lower cost. Learn more.
      • Earn the new Google Kubernetes Engine skill badge for free—We’ve added a new skill badge this month, Optimize Costs for Google Kubernetes Engine (GKE), which you can earn for free when you sign up for the Kubernetes track of the skills challenge. The skills challenge provides 30 days free access to Google Cloud labs and gives you the opportunity to earn skill badges to showcase different cloud competencies to employers. Learn more.
      • Now available: carbon free energy percentages for our Google Cloud regions—Google first achieved carbon neutrality in 2007, and since 2017 we’ve purchased enough solar and wind energy to match 100% of our global electricity consumption. Now we’re building on that progress to target a new sustainability goal: running our business on carbon-free energy 24/7, everywhere, by 2030. Beginning this week, we’re sharing data about how we are performing against that objective so our customers can select Google Cloud regions based on the carbon-free energy supplying them. Learn more.
      • Increasing bandwidth to C2 and N2 VMs—We announced the public preview of 100, 75, and 50 Gbps high-bandwidth network configurations for General Purpose N2 and Compute Optimized C2 Compute Engine VM families as part of continuous efforts to optimize our Andromeda host networking stack. This means we can now offer higher-bandwidth options on existing VM families when using the Google Virtual NIC (gVNIC). These VMs were previously limited to 32 Gbps. Learn more.
      • New research on how COVID-19 changed the nature of IT—To learn more about the impact of COVID-19 and the resulting implications to IT, Google commissioned a study by IDG to better understand how organizations are shifting their priorities in the wake of the pandemic. Learn more and download the report.

      • New in API security—Google Cloud Apigee API management platform's latest release, Apigee X, works with Cloud Armor to protect your APIs with advanced security technology including DDoS protection, geo-fencing, OAuth, and API keys. Learn more about our integrated security enhancements here.

      • Troubleshoot errors more quickly with Cloud Logging—The Logs Explorer now automatically breaks down your log results by severity, making it easy to spot spikes in errors at specific times. Learn more about our new histogram functionality here.

      Week of Mar 8-Mar 12, 2021

      • Introducing #AskGoogleCloud on Twitter and YouTube—Our first segment on March 12th features Developer Advocates Stephanie Wong, Martin Omander and James Ward to answer questions on the best workloads for serverless, the differences between “serverless” and “cloud native,” how to accurately estimate costs for using Cloud Run, and much more. Learn more.
      • Learn about the value of no-code hackathons—Google Cloud’s no-code application development platform, AppSheet, makes hackathons accessible to “non-technical” employees, with no coding needed to compete. Learn about Globe Telecom’s no-code hackathon as well as their winning AppSheet app here.
      • Introducing Cloud Code Secret Manager Integration—Secret Manager provides a central place and single source of truth to manage, access, and audit secrets across Google Cloud. Integrating Cloud Code with Secret Manager brings the powerful capabilities of both these tools together so you can create and manage your secrets right from within your preferred IDE, whether that be VS Code, IntelliJ, or Cloud Shell Editor. Learn more.
      • Flexible instance configurations in Cloud SQL—Cloud SQL for MySQL now supports flexible instance configurations which offer you the extra freedom to configure your instance with the specific number of vCPUs and GB of RAM that fits your workload. To set up a new instance with a flexible instance configuration, see our documentation here.
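        As an illustrative sketch (the instance name and region are placeholders), a flexible configuration is selected with the --cpu and --memory flags:

        ```shell
        # 4 vCPUs with 26 GB of RAM, rather than a predefined machine tier.
        gcloud sql instances create my-mysql-instance \
            --database-version=MYSQL_8_0 \
            --cpu=4 \
            --memory=26GB \
            --region=us-central1
        ```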
      • The Cloud Healthcare Consent Management API is now generally available—The Healthcare Consent Management API is now GA, giving customers the ability to greatly scale the management of consents to meet increasing need, particularly amidst the emerging task of managing health data for new care and research scenarios. Learn more.

      Week of Mar 1-Mar 5, 2021

      • Cloud Run is now available in all Google Cloud regions. Learn more.
      • Introducing Apache Spark Structured Streaming connector for Pub/Sub Lite—We’re announcing the release of an open source connector to read streams of messages from Pub/Sub Lite into Apache Spark. The connector works in all Apache Spark 2.4.X distributions, including Dataproc, Databricks, and manual Spark installations. Learn more.
      • Google Cloud Next ‘21 is October 12-14, 2021—Join us and learn how the most successful companies have transformed their businesses with Google Cloud. Sign up at g.co/cloudnext for updates. Learn more.
      • Hierarchical firewall policies now GA—Hierarchical firewalls provide a means to enforce firewall rules at the organization and folder levels in the GCP Resource Hierarchy. This allows security administrators at different levels in the hierarchy to define and deploy consistent firewall rules across a number of projects so they're applied to all VMs in currently existing and yet-to-be-created projects. Learn more.
      • Announcing the Google Cloud Born-Digital Summit—Over this half-day event, we’ll highlight proven best-practice approaches to data, architecture, diversity & inclusion, and growth with Google Cloud solutions. Learn more and register for free.
      • Google Cloud products in 4 words or less (2021 edition)—Our popular “4 words or less Google Cloud developer’s cheat sheet” is back and updated for 2021. Learn more.
      • Gartner names Google a leader in its 2021 Magic Quadrant for Cloud AI Developer Services report—We believe this recognition is based on Gartner’s evaluation of Google Cloud’s language, vision, conversational, and structured data services and solutions for developers. Learn more.
      • Announcing the Risk Protection Program—The Risk Protection Program offers customers peace of mind through the technology to secure their data, the tools to monitor the security of that data, and an industry-first cyber policy offered by leading insurers. Learn more.
      • Building the future of work—We’re introducing new innovations in Google Workspace to help people collaborate and find more time and focus, wherever and however they work. Learn more.

      • Assured Controls and expanded Data Regions—We’ve added new information governance features in Google Workspace to help customers control their data based on their business goals. Learn more.

      Week of Feb 22-Feb 26, 2021

      • 21 Google Cloud tools explained in 2 minutes—Need a quick overview of Google Cloud core technologies? Quickly learn these 21 Google Cloud products—each explained in under two minutes. Learn more.

      • BigQuery materialized views now GA—Materialized views (MV’s) are precomputed views that periodically cache results of a query to provide customers increased performance and efficiency. Learn more.
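        For example, a materialized view over a hypothetical orders table (the dataset and column names are assumptions) can be created with standard SQL via the bq CLI; BigQuery then refreshes the cached aggregation automatically and uses it to accelerate matching queries:

        ```shell
        bq query --use_legacy_sql=false '
        CREATE MATERIALIZED VIEW mydataset.daily_totals AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM mydataset.orders
        GROUP BY order_date'
        ```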

      • New in BigQuery BI Engine—We’re extending BigQuery BI Engine to work with any BI or custom dashboarding applications that require sub-second query response times. In this preview, BI Engine will work seamlessly with Looker and other popular BI tools such as Tableau and Power BI without requiring any change to the BI tools. Learn more.

      • Dataproc now supports Shielded VMs—All Dataproc clusters created using Debian 10 or Ubuntu 18.04 operating systems now use Shielded VMs by default, and customers can provide their own configurations for secure boot, vTPM, and Integrity Monitoring. This is just one of the many ways customers who have migrated their Hadoop and Spark clusters to GCP see continued improvements to their security posture at no additional cost.

      • New Cloud Security Podcast by Google—Our new podcast brings you stories and insights on security in the cloud, delivering security from the cloud, and, of course, on what we’re doing at Google Cloud to help keep customer data safe and workloads secure. Learn more.

      • New in Conversational AI and Apigee technology—Australian retailer Woolworths provides seamless customer experiences with their virtual agent, Olive. Apigee API Management and Dialogflow technology allows customers to talk to Olive through voice and chat. Learn more.

      • Introducing GKE Autopilot—GKE already offers an industry-leading level of automation that makes setting up and operating a Kubernetes cluster easier and more cost effective than do-it-yourself and other managed offerings. Autopilot represents a significant leap forward. In addition to the fully managed control plane that GKE has always provided, using the Autopilot mode of operation automatically applies industry best practices and can eliminate all node management operations, maximizing your cluster efficiency and helping to provide a stronger security posture. Learn more.

      • Partnering with Intel to accelerate cloud-native 5G—As we continue to grow cloud-native services for the telecommunications industry, we’re excited to announce a collaboration with Intel to develop reference architectures and integrated solutions for communications service providers to accelerate their deployment of 5G and edge network solutions. Learn more.

      • Veeam Backup for Google Cloud now available—Veeam Backup for Google Cloud automates Google-native snapshots to securely protect VMs across projects and regions with ultra-low RPOs and RTOs, and store backups in Google Object Storage to enhance data protection while ensuring lower costs for long-term retention.

      • Migrate for Anthos 1.6 GA—With Migrate for Anthos, customers and partners can automatically migrate and modernize traditional application workloads running in VMs into containers running on Anthos or GKE. Included in this new release: 

        • In-place modernization for Anthos on AWS (Public Preview) to help customers accelerate on-boarding to Anthos AWS while leveraging their existing investment in AWS data sources, projects, VPCs, and IAM controls.

        • Additional Docker registries and artifacts repositories support (GA) including AWS ECR, basic-auth docker registries, and AWS S3 storage to provide further flexibility for customers using Anthos Anywhere (on-prem, AWS, etc.). 

        • HTTPS Proxy support (GA) to enable M4A functionality (access to external image repos and other services) where a proxy is used to control external access.

      Week of Feb 15-Feb 19, 2021

      • Introducing Cloud Domains in preview—Cloud Domains simplify domain registration and management within Google Cloud, improve the custom domain experience for developers, increase security, and support stronger integrations around DNS and SSL. Learn more.

      • Announcing Databricks on Google Cloud—Our partnership with Databricks enables customers to accelerate Databricks implementations by simplifying their data access, by jointly giving them powerful ways to analyze their data, and by leveraging our combined AI and ML capabilities to impact business outcomes. Learn more.

      • Service Directory is GA—As the number and diversity of services grows, it becomes increasingly challenging to maintain an inventory of all of the services across an organization. Last year, we launched Service Directory to help simplify the problem of service management. Today, it’s generally available. Learn more.

      Week of Feb 8-Feb 12, 2021

      • Introducing Bare Metal Solution for SAP workloads—We’ve expanded our Bare Metal Solution—dedicated, single-tenant systems designed specifically to run workloads that are too large or otherwise unsuitable for standard, virtualized environments—to include SAP-certified hardware options, giving SAP customers great options for modernizing their biggest and most challenging workloads. Learn more.

      • 9TB SSDs bring ultimate IOPS/$ to Compute Engine VMs—You can now attach 6TB and 9TB Local SSD to second-generation general-purpose N2 Compute Engine VMs, for great IOPS per dollar. Learn more.
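        Local SSD capacity is attached in 375 GB partitions, so a 9TB configuration means 24 --local-ssd flags. A sketch, where the instance name, zone, and machine type are placeholder assumptions (brace expansion keeps the repetition readable in bash):

        ```shell
        gcloud compute instances create my-n2-vm \
            --zone=us-central1-a \
            --machine-type=n2-standard-32 \
            $(printf -- '--local-ssd=interface=NVME %.0s' {1..24})
        ```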

      • Supporting the Python ecosystem—As part of our longstanding support for the Python ecosystem, we are happy to increase our support for the Python Software Foundation, the non-profit behind the Python programming language, ecosystem and community. Learn more.

      • Migrate to regional backend services for Network Load Balancing—We now support backend services with Network Load Balancing—a significant enhancement over the prior approach, target pools, providing a common unified data model for all our load-balancing family members and accelerating the delivery of exciting features on Network Load Balancing. Learn more.

      Week of Feb 1-Feb 4, 2021

      • Apigee launches Apigee X—Apigee celebrates its 10-year anniversary with Apigee X, a new release of the Apigee API management platform. Apigee X harnesses the best of Google technologies to accelerate and globalize your API-powered digital initiatives. Learn more about Apigee X and digital excellence here.
      • Celebrating the success of Black founders with Google Cloud during Black History Month—February is Black History Month, a time for us to come together to celebrate and remember the important people and history of African heritage. Over the next four weeks, we will highlight four Black-led startups and how they use Google Cloud to grow their businesses. Our first feature highlights TQIntelligence and its founder, Yared.

      Week of Jan 25-Jan 29, 2021

      • BeyondCorp Enterprise now generally available—BeyondCorp Enterprise is a zero trust solution, built on Google’s global network, which provides customers with simple and secure access to applications and cloud resources and offers integrated threat and data protection. To learn more, read the blog post, visit our product homepage, and register for our upcoming webinar.

      Week of Jan 18-Jan 22, 2021

      • Cloud Operations Sandbox now available—Cloud Operations Sandbox is an open-source tool that helps you learn SRE practices from Google and apply them on cloud services using Google Cloud’s operations suite (formerly Stackdriver), with everything you need to get started in one click. You can read our blog post, or get started by visiting cloud-ops-sandbox.dev, exploring the project repo, and following along in the user guide.

      • New data security strategy whitepaper—Our new whitepaper shares our best practices for how to deploy a modern and effective data security program in the cloud. Read the blog post or download the paper.   

      • WebSockets, HTTP/2 and gRPC bidirectional streams come to Cloud Run—With these capabilities, you can deploy new kinds of applications to Cloud Run that were not previously supported, while taking advantage of serverless infrastructure. These features are now available in public preview for all Cloud Run locations. Read the blog post or check out the WebSockets demo app or the sample h2c server app.
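        gRPC bidirectional streams need end-to-end HTTP/2 (h2c), which an existing service can opt into with a single flag (the service name below is a placeholder); WebSockets and plain server-side streaming require no configuration change:

        ```shell
        gcloud run services update my-service --use-http2
        ```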

      • New tutorial: Build a no-code workout app in 5 steps—Looking to crush your new year’s resolutions? Using AppSheet, Google Cloud’s no-code app development platform, you can build a custom fitness app that can do things like record your sets, reps and weights, log your workouts, and show you how you’re progressing. Learn how.

      Week of Jan 11-Jan 15, 2021

      • State of API Economy 2021 Report now available—Google Cloud details the changing role of APIs in 2020 amidst the COVID-19 pandemic, informed by a comprehensive study of Apigee API usage behavior across industry, geography, enterprise size, and more. Discover these 2020 trends along with a projection of what to expect from APIs in 2021. Read our blog post here or download and read the report here.
      • New in the state of no-code—Google Cloud's AppSheet looks back at the key no-code application development themes of 2020. AppSheet contends the rising number of citizen developer app creators will ultimately change the state of no-code in 2021. Read more here.

      Week of Jan 4-Jan 8, 2021

      • Last year's most popular API posts—In an arduous year, thoughtful API design and strategy are critical to empowering developers and companies to use technology for global good. Google Cloud looks back at the must-read API posts in 2020. Read it here.

      Week of Dec 21-Dec 25, 2020

      Week of Dec 14-Dec 18, 2020

      • Memorystore for Redis enables TLS encryption support (Preview)—With this release, you can now use Memorystore for applications requiring sensitive data to be encrypted between the client and the Memorystore instance. Read more here.
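        Connecting over TLS from redis-cli might look like this sketch, assuming a TLS-capable redis-cli build; HOST and the server CA certificate come from the instance details, and TLS-enabled instances listen on port 6378 rather than the default 6379:

        ```shell
        redis-cli -h HOST -p 6378 --tls --cacert server-ca.pem PING
        ```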
      • Monitoring Query Language (MQL) for Cloud Monitoring is now generally available—Monitoring Query Language provides developers and operators on IT and development teams powerful metric querying, analysis, charting, and alerting capabilities. This functionality is needed for Monitoring use cases that include troubleshooting outages, root cause analysis, custom SLI / SLO creation, reporting and analytics, complex alert logic, and more. Learn more.

      Week of Dec 7-Dec 11, 2020

      • Memorystore for Redis now supports Redis AUTH—With this release you can now use OSS Redis AUTH feature with Memorystore for Redis instances. Read more here.
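        As a sketch (the instance name and region are placeholders), the AUTH string can be fetched with gcloud and then supplied to redis-cli:

        ```shell
        gcloud redis instances get-auth-string my-instance --region=us-central1
        redis-cli -h HOST -p 6379 -a AUTH_STRING PING
        ```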
      • New in serverless computing—Google Cloud API Gateway and its service-first approach to developing serverless APIs helps organizations accelerate innovation by eliminating scalability and security bottlenecks for their APIs. Discover more benefits here.
      • Environmental Dynamics, Inc. makes a big move to no-code—The environmental consulting company EDI built and deployed 35+ business apps, with no coding skills required, using Google Cloud’s AppSheet. This no-code effort not only empowered field workers, but also saved employees over 2,550 hours a year. Get the full story here.
      • Introducing Google Workspace for Government—Google Workspace for Government is an offering that brings the best of Google Cloud’s collaboration and communication tools to the government with pricing that meets the needs of the public sector. Whether it’s powering social care visits, employment support, or virtual courts, Google Workspace helps governments meet the unique challenges they face as they work to provide better services in an increasingly virtual world. Learn more.

      Week of Nov 30-Dec 4, 2020

      • Google enters agreement to acquire Actifio—Actifio, a leader in backup and disaster recovery (DR), offers customers the opportunity to protect virtual copies of data in their native format, manage these copies throughout their entire lifecycle, and use these copies for scenarios like development and test. This planned acquisition further demonstrates Google Cloud’s commitment to helping enterprises protect workloads on-premises and in the cloud. Learn more.
      • Traffic Director can now send traffic to services and gateways hosted outside of Google Cloud—Traffic Director support for Hybrid Connectivity Network Endpoint Groups (NEGs), now generally available, enables services in your VPC network to interoperate more seamlessly with services in other environments. It also enables you to build advanced solutions based on Google Cloud's portfolio of networking products, such as Cloud Armor protection for your private on-prem services. Learn more.
      • Google Cloud launches the Healthcare Interoperability Readiness Program—This program, powered by APIs and Google Cloud’s Apigee, helps patients, doctors, researchers, and healthcare technologists alike by making patient data and healthcare data more accessible and secure. Learn more here.
      • Container Threat Detection in Security Command Center—We announced the general availability of Container Threat Detection, a built-in service in Security Command Center. This release includes multiple detection capabilities to help you monitor and secure your container deployments in Google Cloud. Read more here.
      • Anthos on bare metal now GA—Anthos on bare metal opens up new possibilities for how you run your workloads, and where. You can run Anthos on your existing virtualized infrastructure, or eliminate the dependency on a hypervisor layer to modernize applications while reducing costs. Learn more.

      Week of Nov 23-27, 2020

      • Tuning control support in Cloud SQL for MySQL—We’ve made all 80 flags that were previously in preview now generally available (GA), empowering you with the controls you need to optimize your databases. See the full list here.
      • New in BigQuery ML—We announced the general availability of boosted trees using XGBoost, deep neural networks (DNNs) using TensorFlow, and model export for online prediction. Learn more.
      • New AI/ML in retail report—We recently commissioned a survey of global retail executives to better understand which AI/ML use cases across the retail value chain drive the highest value and returns in retail, and what retailers need to keep in mind when going after these opportunities. Learn more or read the report.

      Week of Nov 16-20, 2020

      • New whitepaper on how AI helps the patent industry—Our new paper outlines a methodology to train a BERT (bidirectional encoder representation from transformers) model on over 100 million patent publications from the U.S. and other countries using open-source tooling. Learn more or read the whitepaper.
      • Google Cloud support for .NET 5.0—Learn more about our support of .NET 5.0, as well as how to deploy it to Cloud Run.
      • .NET Core 3.1 now on Cloud Functions—With this integration you can write cloud functions using your favorite .NET Core 3.1 runtime with our Functions Framework for .NET for an idiomatic developer experience. Learn more.
      • Filestore Backups in preview—We announced the availability of the Filestore Backups preview in all regions, making it easier to migrate your business continuity, disaster recovery and backup strategy for your file systems in Google Cloud. Learn more.
      • Introducing Voucher, a service to help secure the container supply chain—Developed by the Software Supply Chain Security team at Shopify to work with Google Cloud tools, Voucher evaluates container images created by CI/CD pipelines and signs those images if they meet certain predefined security criteria. Binary Authorization then validates these signatures at deploy time, ensuring that only explicitly authorized code that meets your organizational policy and compliance requirements can be deployed to production. Learn more.
      • 10 most watched from Google Cloud Next ‘20: OnAir—Take a stroll through the 10 sessions that were most popular from Next OnAir, covering everything from data analytics to cloud migration to no-code development. Read the blog.
      • Artifact Registry is now GA—With support for container images, Maven, npm packages, and additional formats coming soon, Artifact Registry helps your organization benefit from scale, security, and standardization across your software supply chain. Read the blog.

      Week of Nov 9-13, 2020

      • Introducing the Anthos Developer Sandbox—The Anthos Developer Sandbox gives you an easy way to learn to develop on Anthos at no cost, available to anyone with a Google account. Read the blog.
      • Database Migration Service now available in preview—Database Migration Service (DMS) makes migrations to Cloud SQL simple and reliable. DMS supports migrations of self-hosted MySQL databases—either on-premises or in the cloud, as well as managed databases from other clouds—to Cloud SQL for MySQL. Support for PostgreSQL is currently available for limited customers in preview, with SQL Server coming soon. Learn more.
      • Troubleshoot deployments or production issues more quickly with new logs tailing—We’ve added support for a new API to tail logs with low latency. Using gcloud, it allows you the convenience of tail -f with the powerful query language and centralized logging solution of Cloud Logging. Learn more about this preview feature.
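        For example, streaming only errors from Compute Engine instances might look like this (the filter is an illustrative assumption; at preview time the command lived in the alpha component):

        ```shell
        gcloud alpha logging tail 'resource.type="gce_instance" severity>=ERROR'
        ```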
      • Regionalized log storage now available in 5 new regions in preview—You can now select where your logs are stored from one of five regions in addition to global—asia-east1, europe-west1, us-central1, us-east1, and us-west1. When you create a logs bucket, you can set the region in which you want to store your logs data. Get started with this guide.
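        A minimal sketch, assuming a placeholder bucket name:

        ```shell
        gcloud logging buckets create my-eu-logs \
            --location=europe-west1 \
            --description="EU-resident application logs"
        ```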

      Week of Nov 2-6, 2020

      • Cloud SQL adds support for PostgreSQL 13—Shortly after its community GA, Cloud SQL has added support for PostgreSQL 13. You get access to the latest features of PostgreSQL while Cloud SQL handles the heavy operational lifting, so your team can focus on accelerating application delivery. Read more here.
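        Creating a PostgreSQL 13 instance is a one-liner; the instance name, shape, and region below are placeholder assumptions:

        ```shell
        gcloud sql instances create my-pg13-instance \
            --database-version=POSTGRES_13 \
            --cpu=2 --memory=8GB \
            --region=us-central1
        ```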
      • Apigee creates value for businesses running on SAP—Google Cloud’s API Management platform Apigee is optimized for data insights and data monetization, helping businesses running on SAP innovate faster without fear of SAP-specific challenges to modernization. Read more here.
      • Document AI platform is live—The new Document AI (DocAI) platform, a unified console for document processing, is now available in preview. You can quickly access all parsers, tools and solutions (e.g. Lending DocAI, Procurement DocAI) with a unified API, enabling an end-to-end document solution from evaluation to deployment. Read the full story here or check it out in your Google Cloud console.
      • Accelerating data migration with Transfer Appliances TA40 and TA300—We’re announcing the general availability of new Transfer Appliances. Customers are looking for fast, secure and easy to use options to migrate their workloads to Google Cloud and we are addressing their needs with next generation Transfer Appliances. Learn more about Transfer Appliances TA40 and TA300.

      Week of Oct 26-30, 2020

      • B.H., Inc. accelerates digital transformation—The Utah-based contracting and construction company BHI eliminated IT backlog when non-technical employees were empowered to build equipment inspection, productivity, and other custom apps by choosing Google Workspace and the no-code app development platform, AppSheet. Read the full story here.
      • Globe Telecom embraces no-code development—Google Cloud’s AppSheet empowers Globe Telecom employees to do more innovating with less code. The global communications company kickstarted their no-code journey by combining the power of AppSheet with a unique adoption strategy. As a result, AppSheet helped Globe Telecom employees build 59 business apps in just 8 weeks. Get the full story.
      • Cloud Logging now allows you to control access to logs via Log Views—Building on the control offered via Log Buckets (blog post), you can now configure who has access to logs based on the source project, resource type, or log name, all using standard IAM controls. Log views, currently in Preview, can help you build a system using the principle of least privilege, limiting sensitive logs to only users who need this information. Learn more about Log Views.
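        A sketch of creating a view scoped to a single source project; the bucket, view, and project names are assumptions, and the command group was itself in preview at the time:

        ```shell
        gcloud logging views create my-view \
            --bucket=my-bucket --location=global \
            --log-filter='source("projects/my-project")'
        ```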
      • Document AI is HIPAA compliant—Document AI now supports HIPAA compliance. Healthcare and life science customers such as health care providers, health plans, and life science organizations can unlock insights by quickly extracting structured data from medical documents while safeguarding individuals’ protected health information (PHI). Learn more about Google Cloud’s nearly 100 products that support HIPAA compliance.

      Week of Oct 19-23, 2020

      • Improved security and governance in Cloud SQL for PostgreSQL—Cloud SQL for PostgreSQL now integrates with Cloud IAM (preview) to provide simplified and consistent authentication and authorization. Cloud SQL has also enabled PostgreSQL Audit Extension (preview) for more granular audit logging. Read the blog.
      • Announcing the AI in Financial Crime Compliance webinar—Our executive digital forum will feature industry executives, academics, and former regulators who will discuss how AI is transforming financial crime compliance on November 17. Register now.
      • Transforming retail with AI/ML—New research provides insights on high value AI/ML use cases for food, drug, mass merchant and speciality retail that can drive significant value and build resilience for your business. Learn what the top use cases are for your sub-segment and read real world success stories. Download the ebook here and view this companion webinar which also features insights from Zulily.
      • New release of Migrate for Anthos—We’re introducing two important new capabilities in the 1.5 release of Migrate for Anthos, Google Cloud's solution to easily migrate and modernize applications currently running on VMs so that they instead run on containers in Google Kubernetes Engine or Anthos. The first is GA support for modernizing IIS apps running on Windows Server VMs. The second is a new utility that helps you identify which VMs in your existing environment are the best targets for modernization to containers. Start migrating or check out the assessment tool documentation (Linux | Windows).
      • New Compute Engine autoscaler controls—New scale-in controls in Compute Engine let you limit the VM deletion rate by preventing the autoscaler from reducing a MIG's size by more VM instances than your workload can tolerate to lose. Read the blog.
      • Lending DocAI in preview—Lending DocAI is a specialized solution in our Document AI portfolio for the mortgage industry that processes borrowers’ income and asset documents to speed up loan applications. Read the blog, or check out the product demo.

      Week of Oct 12-16, 2020

      • New maintenance controls for Cloud SQL—Cloud SQL now offers maintenance deny period controls, which allow you to prevent automatic maintenance from occurring during a 90-day time period. Read the blog.
      • Trends in volumetric DDoS attacks—This week we published a deep dive into DDoS threats, detailing the trends we’re seeing and giving you a closer look at how we prepare for multi-terabit attacks so your sites stay up and running. Read the blog.
      • New in BigQuery—We shared a number of updates this week, including new SQL capabilities, more granular control over your partitions with time unit partitioning, the general availability of Table ACLs, and BigQuery System Tables Reports, a solution that aims to help you monitor BigQuery flat-rate slot and reservation utilization by leveraging BigQuery’s underlying INFORMATION_SCHEMA views. Read the blog.
      • Cloud Code makes YAML easy for hundreds of popular Kubernetes CRDs—We announced authoring support for more than 400 popular Kubernetes CRDs out of the box, any existing CRDs in your Kubernetes cluster, and any CRDs you add from your local machine or a URL. Read the blog.
      • Google Cloud’s data privacy commitments for the AI era—We’ve outlined how our AI/ML Privacy Commitment reflects our belief that customers should have both the highest level of security and the highest level of control over data stored in the cloud. Read the blog.

      • New, lower pricing for Cloud CDN—We’ve reduced the price of cache fill (content fetched from your origin) charges across the board, by up to 80%, along with our recent introduction of a new set of flexible caching capabilities, to make it even easier to use Cloud CDN to optimize the performance of your applications. Read the blog.

      • Expanding the BeyondCorp Alliance—Last year, we announced our BeyondCorp Alliance with partners that share our Zero Trust vision. Today, we’re announcing new partners to this alliance. Read the blog.

      • New data analytics training opportunities—Throughout October and November, we’re offering a number of no-cost ways to learn data analytics, with trainings for beginners to advanced users. Learn more.

      • New BigQuery blog series—BigQuery Explained provides overviews on storage, data ingestion, queries, joins, and more. Read the series.

      Week of Oct 5-9, 2020

      • Introducing the Google Cloud Healthcare Consent Management API—This API gives healthcare application developers and clinical researchers a simple way to manage individuals’ consent of their health data, particularly important given the new and emerging virtual care and research scenarios related to COVID-19. Read the blog.

      • Announcing Google Cloud buildpacks—Based on the CNCF buildpacks v3 specification, these buildpacks produce container images that follow best practices and are suitable for running on all of our container platforms: Cloud Run (fully managed), Anthos, and Google Kubernetes Engine (GKE). Read the blog.
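        With the pack CLI and Docker installed, building and running an image from local source might look like this sketch (the image name is a placeholder):

        ```shell
        # No Dockerfile needed; the builder detects the language and
        # produces an OCI image following best practices.
        pack build my-app --builder gcr.io/buildpacks/builder:v1
        docker run --rm -p 8080:8080 my-app
        ```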

      • Providing open access to the Genome Aggregation Database (gnomAD)—Our collaboration with Broad Institute of MIT and Harvard provides free access to one of the world's most comprehensive public genomic datasets. Read the blog.

      • Introducing HTTP/gRPC server streaming for Cloud Run—Server-side HTTP streaming for your serverless applications running on Cloud Run (fully managed) is now available. This means your Cloud Run services can serve larger responses or stream partial responses to clients during the span of a single request, enabling quicker server response times for your applications. Read the blog.

      • New security and privacy features in Google Workspace—Alongside the announcement of Google Workspace we also shared more information on new security features that help facilitate safe communication and give admins increased visibility and control for their organizations. Read the blog.

      • Introducing Google Workspace—Google Workspace includes all of the productivity apps you know and use at home, at work, or in the classroom—Gmail, Calendar, Drive, Docs, Sheets, Slides, Meet, Chat and more—now more thoughtfully connected. Read the blog.

      • New in Cloud Functions: languages, availability, portability, and more—We extended Cloud Functions—our scalable pay-as-you-go Functions-as-a-Service (FaaS) platform that runs your code with zero server management—so you can now use it to build end-to-end solutions for several key use cases. Read the blog.

      • Announcing the Google Cloud Public Sector Summit, Dec 8-9—Our upcoming two-day virtual event will offer thought-provoking panels, keynotes, customer stories and more on the future of digital service in the public sector. Register at no cost.

    • Cloud TPU v4 records fastest training times on five MLPerf 2.0 benchmarks Wed, 29 Jun 2022 17:00:00 -0000

      Today, ML-driven innovation is fundamentally transforming computing, enabling entirely new classes of internet services. For example, recent state-of-the-art large models such as PaLM and Chinchilla herald a coming paradigm shift in which ML services will augment human creativity. All indications are that we are still in the early stages of what will be the next qualitative step function in computing. Realizing this transformation will require democratized and affordable access through cloud computing, where the best of compute, networking, storage, and ML can be brought to bear seamlessly on ever larger-scale problem domains.

      Today’s release of MLPerf™ 2.0 results from the MLCommons® Association highlights the public availability of the most powerful and efficient ML infrastructure anywhere. Google’s TPU v4 ML supercomputers set performance records on five benchmarks, with an average speedup of 1.42x over the next fastest non-Google submission, and 1.5x vs. our MLPerf 1.0 submission. Even more compelling — four of these record runs were conducted on the publicly available Google Cloud ML Hub that we announced at Google I/O. ML Hub runs out of our Oklahoma data center, which uses over 90% carbon-free energy.

      Let’s take a closer look at the results.

      Figure 1: TPUs demonstrated significant speedup in all five published benchmarks over the fastest non-Google submission (NVIDIA on-premises). Taller bars are better. The numbers inside the bars represent the quantity of chips / accelerators used for each of the submissions.

      Performance at scale…and in the public cloud

      Our 2.0 submissions¹, all running on TensorFlow, demonstrated leading performance across all five benchmarks. We scaled two of our submissions to run on full TPU v4 Pods. Each Cloud TPU v4 Pod consists of 4096 chips connected together via an ultra-fast interconnect network with an industry-leading 6 terabits per second (Tbps) of bandwidth per host, enabling rapid training for the largest models.

      Hardware aside, these benchmark results were made possible in no small part by our work to improve the TPU software stack. Scalability and performance optimizations in the TPU compiler and runtime, including faster embedding lookups and improved model weight distribution across the TPU pod, enabled much of these improvements, and are now widely available to TPU users. For example, we made a number of performance improvements to the virtualization stack to fully utilize the compute power of both CPU hosts and TPU chips to achieve peak performance on image and recommendation models. These optimizations reflect lessons from Google’s cutting-edge internal ML use cases across Search, YouTube, and more. We are excited to bring the benefits of this work to all Google Cloud users as well.

      Figure 2: Our 2.0 submissions make use of advances in our compiler infrastructure to achieve a larger scale and better per-chip performance across the board than previously possible, averaging a 1.5x speedup over our 1.0 submissions.²

      Translating MLPerf wins to customer wins

      Cloud TPU’s industry-leading performance at scale also translates to cost savings for customers. Based on our analysis summarized in Figure 3, Cloud TPUs on Google Cloud provide ~35-50% savings vs. A100 on Microsoft Azure. We employed the following methodology to calculate this result:²

      1. We compared the end-to-end times of the largest-scale MLPerf submissions, namely ResNet and BERT, from Google and NVIDIA. These submissions make use of a similar number of chips — upwards of 4000 TPU and GPU chips. Since performance does not scale linearly with chip count, we compared two submissions with roughly the same number of chips.

      2. To simplify the 4216-chip A100 comparison for ResNet vs our 4096-chip TPU submission, we made an assumption in favor of GPUs that 4096 A100 chips would deliver the same performance as 4216 chips.

      3. For pricing, we compared our publicly available Cloud TPU v4 on-demand prices ($3.22 per chip-hour) to Azure’s on-demand prices for A100³ ($4.10 per chip-hour). This once again favors the A100s, since we assume zero virtualization overhead in moving from on-prem (NVIDIA’s results) to Azure Cloud.
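      The arithmetic behind this methodology can be sketched in a few lines. The chip counts and per-chip-hour prices below come from the text above; the training times are hypothetical placeholders, not MLPerf results:

```python
# Rough cost model: cost = chips * price_per_chip_hour * (minutes / 60).
# Prices and chip counts are from the text above; the training times below
# are hypothetical placeholders, NOT actual MLPerf results.

def train_cost(chips: int, price_per_chip_hour: float, minutes: float) -> float:
    """Dollar cost of one end-to-end training run."""
    return chips * price_per_chip_hour * minutes / 60.0

TPU_V4_PRICE = 3.22   # $/chip-hour, Cloud TPU v4 on-demand
A100_PRICE = 4.10     # $/chip-hour, Azure A100 on-demand

# Even with identical runtimes and chip counts, price alone yields ~21% savings:
t = 10.0  # hypothetical minutes; cancels out in the ratio
savings = 1 - train_cost(4096, TPU_V4_PRICE, t) / train_cost(4096, A100_PRICE, t)
print(f"price-only savings: {savings:.1%}")

# A faster TPU run widens the gap, e.g. a hypothetical 1.2x speedup:
savings_fast = 1 - train_cost(4096, TPU_V4_PRICE, t / 1.2) / train_cost(4096, A100_PRICE, t)
print(f"with 1.2x speedup: {savings_fast:.1%}")
```

      As the sketch shows, the published savings figures combine both the price gap and the measured speedups.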

      The savings are especially meaningful given that real-world models such as GPT-3 and PaLM are much larger than the BERT and ResNet models used in the MLPerf benchmark: PaLM is a 540 billion parameter model, while the BERT model used in the MLPerf benchmark has only 340 million parameters — a 1000x difference in scale. Based on our experience, the benefits of TPUs will grow significantly with scale and make the case all the more compelling for training on Cloud TPU v4.

      Figure 3: For the BERT model, using Cloud TPU v4 provides ~35% savings over A100, and ~50% savings for ResNet.⁴

      Have your cake and eat it too — a continued focus on sustainability

      Performance at scale must take environmental concerns as a primary constraint and optimization target. The Cloud TPU v4 pods powering our MLPerf results run with 90% carbon-free energy and a Power Usage Efficiency of 1.10, meaning that less than 10% of the power delivered to the data center is lost through conversion, heat, or other sources of inefficiency. The TPU v4 chip delivers 3x the peak FLOPs per watt relative to the v3 generation. This combination of carbon-free energy and extraordinary power delivery and computation efficiency makes Cloud TPUs among the most efficient in the world.⁴
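      The link between the PUE figure and the "less than 10% lost" claim is simple arithmetic (a sketch; PUE 1.10 is the figure quoted above):

```python
# PUE = total facility power / IT equipment power.
# With PUE = 1.10, the overhead (conversion, cooling, etc.) as a share of
# total power delivered to the data center is 1 - 1/PUE.

pue = 1.10
overhead_fraction = 1 - 1 / pue
print(f"overhead: {overhead_fraction:.1%}")  # ~9.1% of delivered power
assert overhead_fraction < 0.10  # consistent with "less than 10%"
```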

      Making the switch to Cloud TPUs

      There has never been a better time for customers to adopt Cloud TPUs. Significant performance and cost savings at scale as well as a deep-rooted focus on sustainability are why customers such as Cohere, LG AI Research, Innersight Labs, and Allen Institute have made the switch. If you are ready to begin using Cloud TPUs for your workloads, please fill out this form. We are excited to partner with ML practitioners around the world to further accelerate the incredible rate of ML breakthroughs and innovation with Google Cloud’s TPU offerings.

      1. MLPerf™ v2.0 Training Closed. Retrieved from https://mlcommons.org/en/training-normal-20/ 29 June 2022, results 2.0-2010, 2.0-2012, 2.0-2098, 2.0-2099, 2.0-2103, 2.0-2106, 2.0-2107, 2.0-2120. The MLPerf name and logo are trademarks of MLCommons Association in the United States and other countries. All rights reserved. Unauthorized use is strictly prohibited. See www.mlcommons.org for more information.
      2. MLPerf v1.0 and v2.0 Training Closed. Retrieved from https://mlcommons.org/en/training-normal-20/ 29 June 2022, results 1.0-1088, 1.0-1090, 1.0-1092, 2.0-2010, 2.0-2012, 2.0-2120.
      3. ND96amsr A100 v4 Azure VMs, powered by eight 80 GB NVIDIA Ampere A100 GPUs (Azure’s flagship Deep Learning and Tightly Coupled HPC GPU offering with CentOS or Ubuntu Linux), were used for this benchmarking.
      4. Cost to train is not an official MLPerf metric and is not verified by MLCommons Association. Azure performance is a favorable estimate as described in the text, not an MLPerf result. Computations are based on results from MLPerf v2.0 Training Closed. Retrieved from https://mlcommons.org/en/training-normal-20/ 29 June 2022, results 2.0-2012, 2.0-2106, 2.0-2107.

      Related Article

      Google demonstrates leading performance in latest MLPerf Benchmarks

      TPU v4 Pods will soon be available on Google Cloud, providing the most powerful publicly available computing platform for machine learnin...

      Read Article
    • Jumpstart your location experiences with new integrations from across Google Wed, 29 Jun 2022 16:00:00 -0000

      We’re always looking for new ways to help you more easily discover, explore and deploy recommended APIs for your mapping needs. Starting today, Google Maps Platform developers at small businesses and large companies alike have access to new features and integrations with other Google products to enhance their end-user experiences and back-end operations. 

      Easily get started with Tailored Recommendations

      If you’re new to Google Maps Platform, you’ll now see guided steps, based on your industry, from initial setup to implementation. For example, if you identify your industry as retail, you’ll receive relevant starter examples and documents to get you up and running right away. These recommendations will give you a jumpstart to explore the best products or solutions based on your industry. 


      Recommendations, such as codelabs and implementation guides, will surface in Cloud Console

      Keep your business locations automatically updated with Locator Plus

      It’s more important than ever to help customers find your business. Starting today, you can keep your store details updated with the Locator Plus solution. Locator Plus allows you to easily import business details from your Business Profile, then explore, customize, and deploy place-based experiences with just a few clicks.

      So now, a change in the business details of your Business Profile will be reflected in the store locator on your website. These changes can include hours, contact information, photos, service options, and more.

      Take, for example, Altostrat–a fictional store with eleven locations in the greater New York area. The owner, Val, deployed a store locator on her website. She decides to give all her employees a break over the holidays and closes the stores for a week. She goes into her Business Profile to edit her operating hours for that week.¹ With the Locator Plus solution, Val doesn't have to manually enter the same Places details again on her website. Any updates to store details on her Business Profile will update in the Locator Plus solution that is live on her website. This new integration gives Val peace of mind that customers have the latest information when searching for her stores–regardless of where they get their information.

      Business Profile Integration

      Easily manage your locations in Locator Plus by importing business details from your Business Profile

      Enable quick and easy appointment bookings

      Help your customers book appointments easily by embedding Reserve with Google booking capabilities into your map.² Reserve with Google offers an end-to-end appointment booking flow, and connects users to a variety of services. Customers using a store locator are showing interest in visiting a store. By providing an option for bookings, right within the locator, you can create an easier booking process. Plus, it can offer you more insight into potential store traffic and a better understanding of services requested.


      Add booking capabilities into your store locator

      Deploy instantly with Google Cloud

      Starting today, we’re expanding ways for you to quickly update and roll out your store locator. In the Locator Plus solution, you can capture the location of every single store you want to show users – all within one map. With Google Cloud, you can now embed this store locator on your application. Previously, developers needed to integrate hundreds of lines of code to embed a store locator that captures each store location.

      For example, Val from Altostrat imports store details from her Business Profile into Locator Plus. Using Google Cloud, she can easily embed the store locator into her website and app–showing users where all of her eleven stores in the greater New York area are based.

      Hosted by Cloud
      Easily deploy your Locator Plus solution using Google Cloud

      Better understand the impact of your implementation

      A new store locator analytics dashboard will help you better understand the impact of your implementation and generate insights from your data. The analytics dashboard provides a picture of how well your site visitors are engaging with your store locator. You’ll be able to measure your performance week over week, including number of views, number of interactions with Search and Place Details, and overall engagement rate. The dashboard uses anonymized data to provide important benchmarks on how a developer’s implementation compares against other developers using the same solution.

      This will be the foundation for analytics that help retailers evaluate the ROI from each solution. Gaining this visibility allows your team to make smarter, informed decisions. 

      The store locator analytics dashboard provides you with performance metrics

      We’re continuously looking for ways to help you get the most out of your deployments from streamlining the getting started process to integrating helpful capabilities from across Google. Ready to start exploring? Try the Locator Plus experience now. To learn more about these features and integrations, sign up for the upcoming Maps OnAir webinar.


      1 User updates are subject to moderation; updates made on Business Profile, when accepted by Google, will be reflected in Place Details. 
      2 Reserve with Google is only available in certain countries/regions where businesses work with a supported booking provider. If you are interested in Reserve with Google, but are currently not working with a Reserve with Google partner, please direct your provider to submit their interest by completing this form and review our documentation to see how they can get started. If you don’t already work with a booking provider, you can see eligible providers in the Bookings section within Business Profile Manager.

    • IP Masquerading and eBPF are now in GKE Autopilot Wed, 29 Jun 2022 16:00:00 -0000

      So you’re deploying Kubernetes, and your containerized applications are ready to go. But one problem you’ve faced is IP exhaustion across your diverse environments, and your clusters need to talk to your on-prem clusters or hosts. Or maybe your workloads talk to a service that expects only RFC 1918 addresses for regulatory or compliance reasons.

      You can now translate your pod IPs to your node IPs on GKE Autopilot clusters with the latest networking features that are generally available:

      • Our Egress NAT policy with IP masquerading for pod to node IP translation is now GA for GKE Autopilot, and
      • Our advanced programmable datapath based on eBPF, Dataplane V2 (DPv2), with support for Network Policy & Logging is also now GA for GKE Autopilot.

      Egress NAT Policy for GKE Autopilot

      Egress NAT policy allows you to masquerade your pod IPs to the node IP addresses, enabling pods (typically in a separate network island) to communicate outside the cluster using the IP address of a node as the source IP. Some of our users have used special IPs (non-RFC 1918 addresses) for their pod ranges to expand their IP usage by leveraging Reserved or Class E IP space. Common reasons for masquerading pod IPs to node IPs include communicating back to on-premises workloads for security or compliance reasons, or simply compatibility. Previously, users were not able to configure IP masquerading due to managed namespace restrictions in GKE Autopilot. With the Egress NAT policy custom resource definition (CRD), we’ve enabled a user-facing API that allows you to configure IP masquerading on GKE Autopilot clusters.
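      Conceptually, source-NAT masquerading rewrites the source IP of traffic leaving the pod range. A minimal Python illustration of the idea (this is not GKE's implementation — the real rewriting happens in the dataplane — and the CIDR and node IP below are made up for the example):

```python
import ipaddress

# Toy model of SNAT masquerading: traffic leaving the cluster has its pod
# source IP rewritten to the hosting node's IP. Illustrative only; GKE's
# dataplane does this, not application code. Addresses are hypothetical.

POD_CIDR = ipaddress.ip_network("240.10.0.0/16")  # e.g. a Class E pod range
NODE_IP = "10.128.0.7"                            # RFC 1918 node address

def masquerade_source(src_ip: str, dst_is_external: bool) -> str:
    """Return the source IP after (possible) SNAT."""
    if dst_is_external and ipaddress.ip_address(src_ip) in POD_CIDR:
        return NODE_IP  # the pod IP is hidden behind the node IP
    return src_ip       # in-cluster traffic keeps the pod IP

print(masquerade_source("240.10.3.25", dst_is_external=True))   # 10.128.0.7
print(masquerade_source("240.10.3.25", dst_is_external=False))  # 240.10.3.25
```

      The non-routable Class E pod address never leaves the cluster; external peers only ever see the RFC 1918 node address.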

      "We use GKE Autopilot because of its reduced operational overhead and potential cost reductions. The addition of IP masquerading via Egress NAT policy expands our use of Autopilot to include accessing on-premises data and systems." —Joey Brown, Engineering Manager at American Family Insurance.

      Our long-term goal is to have the same API and feature set across GKE and Anthos platforms. We have extended Egress NAT policy in Anthos to provide NAT functionality based on Kubernetes resources like namespaces and/or labels. This new Egress NAT policy on GKE Autopilot clusters provides source NAT controls to start; with this launch, we’re taking the first step on that roadmap.

      Cloud Composer 2, a Google-managed workflow orchestration service built on Apache Airflow, uses GKE Autopilot under the hood. Cloud Composer 2 users also benefit from the introduction of Egress NAT policies to enable communication to various environments.

      "We are a big Cloud Composer user as part of our GCP journey. We have dealt with IP shortages by using non-RFC 1918 address space for our GKE clusters. With Egress NAT policy, we can now use IP masquerading with Cloud Composer 2. Workloads using non-RFC 1918 addressing with Cloud Composer 2 are now able to make API calls to our wider Equifax applications. We are excited about using Egress NAT policies with Cloud Composer 2 to enable more of our applications on GCP."–Siddharth Shekhar, Site Reliability Engineer - Specialist at Equifax.

      Egress NAT policy is now generally available on GKE Autopilot clusters with DPv2 in versions 1.22.7-gke.1500+ or 1.23.4-gke.1600+. For configuration examples of Egress NAT policy, please refer to our how-to guide in the GKE documentation.

      GKE Autopilot with Dataplane V2 (DPv2)

      Have you been wanting to segregate your cluster workloads and understand when your Network Policies are enforced? GKE Autopilot now uses Dataplane V2 (DPv2) for container networking, a datapath integrated into Google infrastructure based on eBPF. With this advanced dataplane, you, as a GKE Autopilot user, can now take advantage of features like Network Policy and Network Policy Logging. 

      With DPv2 support, GKE Autopilot clusters can now benefit from the advantages that GKE standard clusters currently have with DPv2:

      • Security via Kubernetes Network Policy 
      • Scalability by removing iptables and kube-proxy implementations
      • Operational benefits with Network Policy Logging
      • Consistency with Anthos and GKE environments.

      Network Policy Logging enables security teams to audit logs and understand allowed or denied traffic flows based on existing Network Policies. It can be configured as an object on your GKE cluster and filtered per various parameters. The following is an example of a logged entry retrieved after an attempted access that was denied.

      jsonPayload:
        connection:
          dest_ip:
          dest_port: 8080
          direction: ingress
          protocol: tcp
          src_ip:
          src_port: 46988
        count: 2
        dest:
          namespace: default
          pod_name: hello-web
          pod_namespace: default
        disposition: deny
        node_name: gk3-autopilot-cluster-1-nap-4lime7d7-dba77360-8td5
        src:
          namespace: default
          pod_name: test-1
          pod_namespace: default
      logName: projects/PROJECT/logs/policy-action
      receiveTimestamp: '2022-04-19T22:07:03.658959451Z'
      resource:
        labels:
          cluster_name: autopilot-cluster-1
          location: us-west1
          node_name: gk3-autopilot-cluster-1-nap-4lime7d7-dba77360-8td5
          project_id: PROJECT
        type: k8s_node
      timestamp: '2022-04-19T22:06:56.139253838Z'

      Network Policy Logs are automatically uploaded to Cloud Logging and can also be retrieved via the Cloud Console Log Explorer. Network Policy metrics are also enabled with Dataplane v2 such that policy event metrics can be monitored even when Network Policy Logging is not enabled.
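      Once these entries land in Cloud Logging, filtering denied flows out of them is straightforward. A sketch working over entries already fetched as Python dicts (the field names follow the log example above; how you fetch the entries — client library or `gcloud logging read` — is up to you):

```python
# Filter denied flows out of fetched Network Policy log entries.
# Entries are assumed to be dicts shaped like the jsonPayload example above.

def denied_flows(entries):
    """Yield (src_pod, dest_pod, dest_port) for every denied connection."""
    for e in entries:
        payload = e.get("jsonPayload", {})
        if payload.get("disposition") == "deny":
            yield (
                payload["src"]["pod_name"],
                payload["dest"]["pod_name"],
                payload["connection"]["dest_port"],
            )

# A single sample entry mirroring the denied-ingress example above:
sample = [{
    "jsonPayload": {
        "disposition": "deny",
        "connection": {"dest_port": 8080, "direction": "ingress", "protocol": "tcp"},
        "src": {"pod_name": "test-1", "pod_namespace": "default"},
        "dest": {"pod_name": "hello-web", "pod_namespace": "default"},
    }
}]
print(list(denied_flows(sample)))  # [('test-1', 'hello-web', 8080)]
```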

      GKE Autopilot uses DPv2 for all newly created clusters starting in GKE versions 1.22.7-gke.1500+ or 1.23.4-gke.1600+. For more information about Dataplane V2, check out our GKE Dataplane V2 docs.

      Getting started with GKE Autopilot with DPv2 is as easy as entering the following gcloud command:

      gcloud container clusters create-auto CLUSTER_NAME \
          --region REGION \
          --project=PROJECT_ID

      To learn more about GKE Autopilot, check out our Overview page.

      Related Article

      Introducing GKE Autopilot: a revolution in managed Kubernetes

      GKE Autopilot gives you a fully managed, hardened Kubernetes cluster out of the box, for true hands-free operations.

      Read Article
    • Secure Supply Chain on Google Cloud Wed, 29 Jun 2022 15:00:00 -0000

      Securing your software requires establishing, verifying, and maintaining a chain of trust. That chain establishes the provenance or origin trail of your code, via attestations, generated and checked throughout your software development and deployment process. At Google, the internal development process enables a level of security, through code review, verified code provenance, and policy enforcement that minimizes software supply chain and related risks. These concepts go hand-in-hand with improved developer productivity. What are the security risk points in your software supply chain and how can you mitigate them with Google Cloud? Let’s explore! 

      Risk points for a software supply chain

      The software development and deployment supply chain is quite complicated, with numerous threats along the source code, build, and publish workflow. Here are some common threats that software development supply chains face:

      A. Submitting “bad” source code (includes compromising or coercing the developer)
      B. Compromising the source control platform, by gaining “admin” access for example 
      C. Injecting malicious behavior into the build pipeline, such as requesting a build from unsubmitted code or specifying build parameters that modify behavior
      D. Compromising the build platform to produce “bad” artifacts (In particular, many CI systems are not configured for “hostile multi-tenancy” within the same project, so an “owner” of a project can compromise their own builds without the team knowing.)
      E. Injecting malicious behavior through a dependency (same attacks recursively)
      F. Deploying a “bad” artifact by bypassing CI/CD
      G. Compromising the package manager / signing platform 
      H. Tricking users into using a “bad” resource instead of a legitimate one (for example, typosquatting)

      Also: modifying an artifact in transit, or compromising the underlying infrastructure of any of the development lifecycle systems.

      Secure Software Development Lifecycle

      How does Google secure the software supply chain internally?

      Google employs several practices to secure its software supply chain internally, including code review, verified code provenance, and policy enforcement throughout the development lifecycle.

      Google Cloud is sharing these practices externally, so that the whole community can benefit. SLSA (Supply-chain Levels for Software Artifacts) is an end-to-end framework for supply chain integrity. It is an OSS-friendly version of what Google has been doing internally. In its current state, SLSA is a set of incrementally adoptable security guidelines being established by industry consensus. 

      How does Google Cloud help you secure your software supply chain?

      Securing your software supply chain involves defining, checking, and enforcing attestations across the software lifecycle. Here is how it works!

      How to secure software development lifecycle with Google Cloud?

      Binary Authorization

      A key element in software supply chain security is the Binary Authorization service, which establishes, verifies, and maintains a chain of trust via attestations and policy checks. Essentially, cryptographic signatures are generated as code or other artifacts move towards production. Before deployment, the attestations are checked based on policies. 

      Let’s walk through the steps of how to achieve ambient security in your development process through policies and provenance on Google Cloud. The first step is understanding your supply chain: what libraries and frameworks you use to write your code.


      Open source is heavily used in lots of software and it can be challenging to determine the risk of open source dependencies. To help address this challenge, we recently launched Open Source Insights, an interactive visualization site for exploring open source software packages. Open Source Insights is unique in that it provides a transitive dependency graph, with continuously updated security advisory, license, and other data across multiple languages in one place. In conjunction with open source scorecards, which provide a risk score for open source projects, Open Source Insights can be used by developers to make better choices across millions of open source packages.
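      The "transitive dependency graph" point is worth making concrete: the risk surface of a package is its full dependency closure, not just its direct dependencies. A small sketch of computing that closure (the package names and graph are hypothetical, not Open Source Insights' API):

```python
from collections import deque

# Transitive closure of a dependency graph via breadth-first search.
# The graph below is hypothetical; Open Source Insights computes this kind
# of closure continuously across real package ecosystems.

DEPS = {
    "my-app": ["web-framework", "json-lib"],
    "web-framework": ["http-core", "template-lib"],
    "http-core": ["tls-lib"],
    "template-lib": [],
    "json-lib": [],
    "tls-lib": [],
}

def transitive_deps(pkg: str) -> set:
    """Every package reachable from pkg (excluding pkg itself)."""
    seen, queue = set(), deque(DEPS.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(DEPS.get(dep, []))
    return seen

print(sorted(transitive_deps("my-app")))
# ['http-core', 'json-lib', 'template-lib', 'tls-lib', 'web-framework']
```

      Here `my-app` declares two dependencies but inherits the risk of five: a vulnerability in `tls-lib` reaches it two hops away.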


      Once your code is checked in, it is built by Cloud Build. Here, another set of attestations is captured, adding to your chain of trust. Examples include what tests were run, what build tools and processes were used, and more. Cloud Build today helps with achieving SLSA level 1, which denotes the level of security of your software supply chain. Cloud Build captures the source of the build configuration, which can be used to validate that the build was scripted (scripted builds are more secure than manual builds, and this is a SLSA 1 requirement). Also, as required, these provenance and other attestations can be looked up using the container image digest, which is a unique signature for an image.

      Cloud Build is a fully managed cloud service. This means that in addition to developer agility, this service gives you a locked-down environment for securing builds, greatly reducing the risk of compromised build integrity or a compromised build system.

      You may also want to ensure you can enforce a security perimeter within your private network to keep data and access private. Cloud Build Private Pools adds support for VPC-SC and private IPs. You can take advantage of the locked down serverless build environment within your own private network.  

      Test and Scan

      Once the build is complete, it is stored in the Artifact Registry where it is automatically scanned for vulnerabilities. This generates additional metadata including an attestation for whether an artifact’s vulnerability results meet certain security thresholds. This information is stored by our container analysis service, which structures and organizes an artifact’s metadata, making it readily accessible to services like Binary Authorization.

      Deploy and Run

      Having built, stored, and scanned the images securely, you are ready to deploy. At this point, attestations captured along the supply chain are verified for authenticity by Binary Authorization. In enforcement mode, the image is deployed only when the attestations meet your organization's policy. In audit mode, policy violations are logged and trigger alerts. Binary Authorization is available for GKE and Cloud Run (preview), ensuring only properly reviewed and authorized code gets deployed. Verification doesn't stop at deployment. Binary Authorization now also supports continuous validation, which ensures continued conformance to the defined policy even post-deployment. If a running application falls out of conformance with an existing or newly added policy, an alert is triggered and logged.
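      The core admission decision can be pictured as a set check: an image deploys only if every attestation the policy requires is present. A deliberately simplified sketch (Binary Authorization verifies cryptographic signatures; the strings and digests below are hypothetical stand-ins, not its actual API):

```python
# Toy model of an attestation policy check. Real attestations are signed;
# plain strings keyed by image digest are enough to show the control flow.

POLICY = {"built-by-trusted-ci", "vulnerability-scan-passed", "code-reviewed"}

ATTESTATIONS = {
    "sha256:aaa111": {"built-by-trusted-ci", "vulnerability-scan-passed", "code-reviewed"},
    "sha256:bbb222": {"built-by-trusted-ci"},  # scan/review attestations missing
}

def admit(image_digest: str, enforce: bool = True) -> bool:
    """Allow deployment iff all required attestations are present.

    In audit mode (enforce=False), violations are logged but not blocked.
    """
    missing = POLICY - ATTESTATIONS.get(image_digest, set())
    if missing and enforce:
        return False  # enforcement mode: block the deployment
    if missing:
        print(f"AUDIT: {image_digest} missing {sorted(missing)}")
    return True

print(admit("sha256:aaa111"))                 # True: all attestations present
print(admit("sha256:bbb222"))                 # False: blocked in enforcement mode
print(admit("sha256:bbb222", enforce=False))  # True: allowed, violation logged
```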

      That was an introduction to the current supply chain capabilities of Google Cloud. To learn more, check out Binary Authorization and SLSA.  

      For more #GCPSketchnote, follow the GitHub repo. For similar cloud content follow me on Twitter @pvergadia and keep an eye out on thecloudgirl.dev 

    • Introducing Query Insights for Cloud Spanner: troubleshoot performance issues with pre-built dashboards Wed, 29 Jun 2022 15:00:00 -0000

      Today, application development teams are more agile and are shipping features faster than ever before. In addition to these rapid development cycles and the rise of microservices architectures, the end-to-end ownership of feature development (and performance monitoring) has moved to a shared responsibility model between advanced database administrators and full-stack developers. However, most developers don’t have the years of experience or the time needed to debug complex query performance issues, and database administrators are now a scarce resource in most organizations. As a result, there is a dire need for tools that let developers and DBAs alike quickly diagnose performance issues.

      Introducing Query Insights for Spanner

      We are delighted to announce the launch of Query Insights for Spanner, a set of visualization tools that provide an easy way for developers and database administrators to quickly diagnose query performance issues on Spanner. Using Query Insights, users can now troubleshoot query performance in a self-serve way. We’ve designed Query Insights using familiar design patterns with world-class visualizations to provide an intuitive experience for anyone who is debugging issues with query performance on Spanner. Query Insights is available at no additional cost.

      By using out-of-the-box visual dashboards and graphs, developers can visualize aberrant behavior like peaks and troughs in various performance metrics over a time series and quickly identify problematic queries. Time series data provides significant value to organizations because it enables them to analyze important real-time and historical metrics. Data is valuable only if it’s easy to comprehend; that’s where being able to view intuitive dashboards becomes a force multiplier for organizations looking to expose their time series data across teams.

      Follow a visual journey with pre-built dashboards

      With Query Insights, developers can seamlessly move from detection of database performance issues to diagnosis of problematic queries using a single interface. Query Insights will help identify query performance issues easily with pre-built dashboards. 

      The user can do this by following a simple journey in which they quickly confirm, identify, and analyze query performance issues. Let’s walk through an example scenario.

      Understand database performance

      This journey starts with the user setting up an alert in Google Cloud Monitoring for CPU utilization exceeding a certain threshold. The alert can be configured so that if this threshold is crossed, the user is notified with an email alert containing a link to the “Monitoring” dashboard.

Once the user receives this alert, they click the link in the email and navigate to the “Monitoring” dashboard. If they observe high CPU utilization and high read latencies, the possible root cause could be expensive queries. A spike in CPU utilization is a strong signal that the system is using more compute than it usually would, often due to an inefficient query.

The next step is to identify which query might be the problem; this is where Query Insights comes in. The user can get to this tool by clicking Query Insights in the left navigation of their Spanner instance. Here, they can drill down into CPU usage by query and observe that for a specific database, CPU utilization (attributed to all queries) is spiking for a particular time window. This confirms that the CPU utilization is due to inefficient queries.

[Screenshot: CPU utilization by query in Query Insights]

      Identifying a problematic query

The user now turns to the TopN graph, which shows the top queries by CPU utilization. From the graph, it is easy to visualize and identify the top queries that could be causing the spike in CPU utilization.

[Screenshot: TopN queries by CPU utilization]

In the above screenshot, we can see that the first query in the table shows a clear spike at 10:33 PM, consuming 48.81% of total CPU. This is a clear indication that the query could be problematic, and the user should investigate further.

      Analyzing the query performance

Once they have identified the problematic query, they can drill down into this query shape to confirm and identify the root cause of the high CPU utilization.

They can do this by clicking the Fingerprint ID for the specific query in the TopN table and navigating to the Query Details page, where they will see a list of metrics (latency, CPU utilization, execution count, rows scanned / rows returned) over a time series for that specific query.

In this example, we notice that the average number of rows scanned for this specific query is very high (~600k rows scanned to return ~12k rows), which could point to a poor query design resulting in an inefficient query. We can also observe that latency is high (1.4 s) for this query.

[Screenshot: Query Details metrics for the problematic query]

      Fixing the issue

To fix the problem in this scenario, the user could optimize the query by specifying a secondary index with a FORCE_INDEX query hint. This would provide more consistent performance, make the query more efficient, and lower CPU utilization for this query.
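As a sketch of what that hint looks like (the table and index names `Orders` and `OrdersByCustomer` are hypothetical, not from this article):

```sql
-- Unoptimized: without an index directive, this can scan far more rows than it returns
SELECT OrderId, Total
FROM Orders
WHERE CustomerId = @customer_id;

-- Optimized: the FORCE_INDEX table hint directs Spanner to use a secondary index
SELECT OrderId, Total
FROM Orders@{FORCE_INDEX=OrdersByCustomer}
WHERE CustomerId = @customer_id;
```

The secondary index must already exist on the filtered column for the hint to apply.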

In the screenshot below, you can see that after specifying the index in the query, query performance improves dramatically in terms of CPU, rows scanned (54k vs. 630k), and query latency (536 ms vs. 1.4 s).

      Unoptimized Query:

[Screenshot: unoptimized query metrics]

      Optimized Query:

[Screenshot: optimized query metrics]

      By following this simple visual journey, the user can easily detect, diagnose and debug inefficient queries on Spanner.

      Get started with Query Insights today

      To learn more about Query Insights, review the documentation here. Query Insights is enabled by default. In the Spanner console, you can click on Query Insights in the left navigation and start visualizing your query performance metrics! 

      New to Spanner? Get started in minutes with a new database.

      Related Article

      Improved troubleshooting with Cloud Spanner introspection capabilities

      Cloud-native database Spanner has new introspection capabilities to monitor database performance and optimize application efficiency.

      Read Article
    • CISO Perspectives: June 2022 Tue, 28 Jun 2022 19:00:00 -0000

      June saw the in-person return of the RSA Conference in San Francisco, one of the largest cybersecurity enterprise conferences in the world. It was great to meet with so many of you at many of our Google Cloud events, at our panel hosted in partnership with Cyversity, and throughout the conference. 

      At RSA we focused on our industry-leading security products, but even more importantly on our goal to make (and encourage others to make) more secure products, not just security products. 

      And remember, we make this newsletter available on the Google Cloud blog and by email—you can subscribe here.

      RSA Conference

      Those of us who attended RSA from Google Cloud were grateful for the chance to connect in person with so many of our customers, partners, and peers from across the industry. Some key themes Google Cloud discussed at press, analyst, government and customer meetings at the conference included: 

      • Digital sovereignty: How the cloud can be used to help organizations address and manage requirements around data localization, and achieve the necessary operational and software sovereignty. We believe that sovereignty is more than just meeting regulatory requirements. These principles can help organizations become more innovative and resilient while giving them the ability to control their digital future.

      • Defending against advanced threats: Organizations are operating against a backdrop of ever more advanced threats, and are looking to enhance their protection through capabilities like posture management and more pervasive implementation of Zero Trust capabilities. We also were focused on work to increase productivity and upskilling of threat management and security operations teams.

      • Threat intelligence: A big part of supporting customers is ongoing interest in how we can further curate and release threat intelligence through our various products and capabilities.

      These themes point to what security and tech decision-makers are looking for: secure products overall, not just security products. This is the backbone of our “shared fate” philosophy at Google Cloud. We know that in today’s environment, we can reduce and prevent toil for our customers by prioritizing security first, and building secure capabilities into all our products and solutions. 

      As RSA brings together incredible people and organizations, we also took stock of work happening across the industry to grow a more diverse cybersecurity workforce. We had the opportunity to host a panel discussion at Google’s San Francisco office with Cyversity and UC Berkeley’s Center for Long-Term Cybersecurity, two organizations who are deeply committed to advancing diversity in our industry.

      MK Palmore, Director, Office of the CISO at Google Cloud, moderates a panel on diversity and cybersecurity with Ann Cleaveland, UC Berkeley; Rob Duhart, Walmart; and Larry Whiteside, Jr., Cyversity. Photo courtesy MK Palmore.

      One resounding takeaway was that diversity of background, experience, and perspective is vital for cybersecurity organizations to effectively manage risks, especially security risks. 

      As my colleague MK Palmore noted, so much of the threat landscape is about problem solving. This is why it’s imperative to bring different views and vantage points to address the most challenging issues. One way we can achieve this is through expanding the talent pipeline. Over one million cybersecurity positions go unfilled each year across the industry, so we need to actively introduce cybersecurity topics to students and new job seekers, including those who come to security from non-traditional backgrounds. 

      Progress requires a combination of private and public partnership, and organizations like Cyversity have established track records of providing women and individuals from underrepresented communities with the right resources and opportunities. As a company, Google is committed to growing a more diverse workforce for today and for the future. 

      Secure Products, not just Security Products

      Security should be built into all products. We all should be focused on constantly improving the base levels of security in all products. 

One example is our recent guide on how to incorporate Google Cloud’s new Assured Open Source Software service into your software supply chain. Assured OSS can provide you with a higher-assurance collection of the open source software that you rely on. Additionally, we are working hard to embed security capabilities across all of our developer tooling, such as Cloud Build, Artifact Registry, and Container/Artifact Analysis.

      Google Cybersecurity Action Team Highlights

      Here are the latest updates, products, services and resources from our cloud security teams this month: 


      • Mapping security with MITRE: Through our research partnership with the MITRE Engenuity Center for Threat-Informed Defense, we have mapped the native security capabilities of Google Cloud to MITRE ATT&CK. This can help customers with their adoption of Autonomic Security Operations, which requires the ability to use threat-informed decision making throughout the continuous detection and continuous response (CD/CR) workflow. Read more.

      • Two new BigQuery capabilities to help secure and manage sensitive data: Managing data access continues to be an important concern for organizations and regulators. To fully address those concerns, sensitive data needs to be protected with the right mechanisms so that data can be kept secure throughout its entire lifecycle. We’re offering two new features in BigQuery that can help secure and manage sensitive data. Now generally available, encryption SQL functions can encrypt and decrypt data at the column level; and in preview is dynamic data masking, which can selectively mask column-level data at query time based on the defined masking rules, user roles, and privileges. 
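As a sketch of the column-level encryption SQL functions (the dataset, table, and column names below are hypothetical, not from this post), a keyset is generated and then used to encrypt and decrypt a column:

```sql
-- Generate an AEAD keyset (store it securely; shown inline only for illustration)
SELECT KEYS.NEW_KEYSET('AEAD_AES_GCM_256') AS keyset;

-- Encrypt a sensitive column, binding the ciphertext to the row's customer_id
SELECT
  customer_id,
  AEAD.ENCRYPT(keyset, ssn, CAST(customer_id AS STRING)) AS ssn_encrypted
FROM my_dataset.customers;

-- Decrypt with the same keyset and additional authenticated data
SELECT
  AEAD.DECRYPT_STRING(keyset, ssn_encrypted, CAST(customer_id AS STRING)) AS ssn
FROM my_dataset.customers_encrypted;
```

Passing the row key as additional authenticated data means a ciphertext copied into another row will fail to decrypt, which is one way these functions help keep column data secure through its lifecycle.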

      • Introducing Confidential GKE Nodes: Part of the growing Confidential Computing product portfolio, Confidential GKE Nodes make sure your data is encrypted in memory. GKE workloads you run today can run confidentially without any code changes.

      • Adding more granular GKE release controls: Customers can now subscribe their GKE clusters to release channels, so that they can decide when, how, and what to upgrade in clusters and nodes. These upgrade release controls can help organizations to automate tasks such as notifying their DevOps teams when a new security patch is available.

      • Detecting password leaks using reCAPTCHA Enterprise: We all know that reusing passwords is a risk. But as long as the password remains an unfortunately common form of account authentication, people will wind up reusing them. reCAPTCHA Enterprise’s password leak detection can help organizations warn their end-users to change passwords. It uses a privacy-preserving API which hides the credential details from Google’s backend services, and allows customers to keep their users’ credentials private. 

      • Database auditing comes to Cloud SQL: This security feature can let customers monitor changes to their Google Cloud SQL Server databases, including database creations, data inserts, and table deletions.

      • DNS zone permissions: Our Cloud DNS has introduced in Preview a new managed zone permissions capability that can allow enterprises with distributed DevOps teams to delegate Cloud DNS managed zone administration to their individual application teams. It can prevent one application team from accidentally changing the DNS records of another application, and it also can allow for a better security posture because only authorized users will be able to modify managed zones. This better supports the principle of least privilege.  

      • New capabilities in Cloud Armor: We’ve expanded Cloud Armor’s coverage to more types of workloads. New edge security policies can help defend workloads using Cloud CDN, Media CDN, and Cloud Storage, and filter requests before they are served from cache. Cloud Armor also now supports the TCP Proxy and SSL Proxy Load Balancers to help block malicious traffic attempting to reach backends behind these load balancers. We’ve also added features to improve the security, reliability, and availability of deployments, including two new rule actions for per-client rate limiting, malicious bot defense in reCAPTCHA Enterprise, and machine learning-based Adaptive Protection to help counter advanced Layer 7 attacks.

      Industry updates

      • How SLSA and SBOM can help healthcare resiliency: Healthcare organizations continue to be a significant target of many different threats, and we are helping the healthcare industry develop more resilient cybersecurity practices. We believe that software bills of materials (SBOMs) and the Supply-chain Levels for Software Artifacts (SLSA) framework are part of developing that resiliency in the face of rising cyberattacks. Securing the software supply chain is a critical priority for defenders and something Google is committed to helping organizations do, which we explain more in-depth in this deep dive on SLSA and SBOM.

      • Google Cloud guidance on merging organizations: When two organizations merge, it’s vital that they integrate their two cloud deployments in as secure a manner as possible. We’ve published these best practices that address some security concerns they may have, especially around Identity and Access Management.

      • Stronger privacy controls for the public sector: Google Workspace has added client-side encryption to let public agencies retain complete confidentiality and control over their data by choosing how and where their encryption keys are stored. 

      Compliance & Controls

      • Google Cloud security overview: Whether your organization is just getting started with its digital transformation or is running on a mature cloud, this wonderfully-illustrated summary of how Google Cloud security works is a great way for business and dev teams to help explain what Google Cloud security can do to make your organization more secure.  

      • New commitments on processing of service data for Google Cloud customers: As part of our work with the Dutch government and its Data Protection Impact Assessment (DPIA) of Google Workspace and Workspace for Education, Google intends to offer new contractual privacy commitments for service data that align with the commitments we offer for customer data. Read more.

      • Google Cloud’s preparations to address DORA: Google Cloud welcomes the inter-institutional agreement reached by European legislators on the Digital Operational Resilience Act (DORA). This is a major milestone in the adoption of new rules designed to ensure financial entities can withstand, respond to, and recover from all types of information and communications technology-related disruptions and threats, including increasingly sophisticated cyberattacks. Read more.

      Google Cloud Security Podcasts

In February 2021, we launched a new podcast focused on cloud security. If you haven’t checked it out, we publish four or five episodes a month in which hosts Anton Chuvakin and Timothy Peacock chat with cybersecurity experts about the most important and challenging topics facing the industry today. This month, they discussed:

      • What good detection and response looks like in the cloud, with Dave Merkel and Peter Silberman, who lead managed detection and response company Expel. Listen here.

      • How Google runs “red team” exercises, with our own Stefan Friedli, senior security engineer. Listen here.

      • Anton and Timothy’s reactions to RSA 2022. Listen here.

      • How best to observe and track cloud security threats, with James Condon, director of security research at cloud security startup Lacework. Listen here.

      • And everything you wanted to know about AI threats but might’ve been afraid to ask, with Nicholas Carlini, research scientist at Google. Listen here.

      To have our Cloud CISO Perspectives post delivered every month to your inbox, sign up for our newsletter. We’ll be back next month with more security-related updates.

      Related Article

      Cloud CISO Perspectives: May 2022

      Google Cloud CISO Phil Venables shares his thoughts on the latest security updates from the Google Cybersecurity Action Team.

      Read Article
    • Introducing Google Public Sector Tue, 28 Jun 2022 16:30:00 -0000

      Google Cloud has a long history of supporting and working with the governments in many different parts of the world, including the United States, Europe, Asia Pacific, Latin America, and Japan. We’ve helped government agencies modernize their core technology systems; transformed the way in which they deliver services via digital platforms to citizens; delivered security solutions to help agencies protect themselves from cyber attacks; provided communication, collaboration, and productivity tools to educational and healthcare systems; and enabled them to use data to improve financial systems and other critical infrastructure. We’ve offered products that address the unique needs of the public sector; built a dedicated sales force, partner ecosystem, and services organization; and worked with partners to bring joint solutions to government and educational institutions. 

      Today, we’re expanding this commitment in the United States with the creation of Google Public Sector, a new Google division that will focus on helping U.S. public sector institutions—including federal, state, and local governments, and educational institutions—accelerate their digital transformations. 

      This new division will operate as a subsidiary of Google LLC and will specialize in bringing Google Cloud technologies, including Google Cloud Platform and Google Workspace, to U.S. public sector customers. Google Public Sector will provide unique products and expertise, such as Google Cloud’s data and analytics platform, artificial intelligence (AI), and machine learning (ML) tools, so institutions can better understand their data and automate core processes. And the division will offer Google Cloud’s highly scalable and reliable open infrastructure, including compute, storage, and networking, so government agencies can modernize their legacy information systems and build new applications that serve citizens with mission-critical reliability and scalability. 

      Google Public Sector experts will help U.S. public sector customers use Google Cloud’s advanced cybersecurity products to protect their users, applications, and data from growing cyber threats. Our experts will assist agencies and educational institutions in their use of Google Workspace to enable secure communication and collaboration, and to attract new employees to the government through the use of these modern tools. And we will also continue to invest in training public sector employees in digital and cloud skills, and in expanding the thriving ecosystem of partners who are already working with Google Cloud to build solutions that meet the urgent and growing needs of U.S. public sector organizations.

      A new subsidiary to address growing customer needs

      Google Public Sector will provide a full complement of business functions and capabilities, including specialized sales, customer engineering, customer success and services, customer support, channel and partner programs, compliance, and security operations, so that our U.S. public sector customers can leverage the full range of technology offerings from Google Cloud. Google Public Sector will also operate in accordance with our existing principles.

      Google Public Sector will be led by Will Grannis, a longtime Google engineering and product veteran, who currently leads Google Cloud’s Office of the CTO. He will serve as head of the new division until a permanent Google Public Sector CEO is named. Lynn Martin will expand her remit to lead the full U.S. Public Sector Go-To-Market organization, which includes federal, state, and local customers, and educational entities, reporting to Will.  

      Will and Lynn, together with other leaders, will bring their deep domain experience in support of the new Google Public Sector division. These leaders include Troy Bertram, managing director of Google Cloud’s Public Sector Partner Ecosystem and former general manager for Worldwide Public Sector at AWS; Jeanette Manfra, senior director of Global Risk and Compliance and former Assistant Director for Cybersecurity at the U.S. Department of Homeland Security; Joel Minton, Google Cloud technical director for the Office of the CTO and former executive director of Login.gov at The White House; Phil Venables, Google Cloud CISO and current advisor on the President’s Council of Advisors on Science and Technology; and many others.

      Consistent with government divisions of other technology companies, Google Public Sector will also have a separate board of directors. The board will serve as an important feedback channel, ensuring Google Public Sector products and services meet the needs of our customers, help us anticipate future needs, and drive differentiation in the U.S. public sector market. The board of directors will have a chair and additional members, to be named later this year.

      Customer momentum in the U.S. public sector

      Over the last several years, we’ve continued to help customers across all levels of the U.S. government on their cloud initiatives and digital transformations. We have undertaken a number of projects with the armed services. For example, we worked with the U.S. Air Force to accelerate collaboration and research, assist with aircraft maintenance, and transform pilot training. We partnered with the U.S. Navy to use Google AI and ML tools to reduce corrosion on ships. And we supported the Defense Innovation Unit in its implementation of our secure cloud management solution—a scalable, highly responsive architecture for managing government network security.

On the civilian side, we announced a partnership with the U.S. Department of Veterans Affairs to improve veteran access to benefits and services. We signed a five-year agreement with the U.S. Department of Energy to provide a broad range of Google Cloud technologies to help DoE scale its research efforts across national labs and field sites. We worked with the U.S. Patent and Trademark Office to help the Office’s more than 9,000 patent examiners rapidly perform more patent searches using AI tools. We partnered with the U.S. Postal Service to improve its customer service across web, mobile, messaging, and call centers. And we recently drove an initiative with the U.S. Forest Service to use Google tools to analyze the impact of environmental change.

      In state and local government, we partnered with the State of Wisconsin and State of Rhode Island to launch brand-new virtual career centers—built on Google Cloud—for job seekers. We helped the State of New York launch a streamlined unemployment application and New York City Cyber Command more quickly detect, prevent, and respond to cyberthreats through high-performance cloud services. We assisted the State of West Virginia in its transition to Google Workspace from its legacy productivity provider, saving millions of dollars. And we partnered with the City of Pittsburgh to move the city to modern, cloud-based IT infrastructure, improving the delivery of services for citizens. And just last week, we announced a new platform called Climate Insights, designed to help government agencies quickly understand and respond to climate change issues.

      Google Cloud is partner-first, and Google Public Sector will operate in a similar manner, bringing customers an expanded ecosystem of partners with extensive expertise in serving public sector institutions. These include Google-certified system integrators like Accenture Federal, Deloitte, ManTech, and World Wide Technology (WWT); resellers like Carahsoft; and independent software vendors like C3.ai and SAP. We will continue to invest in our programs and these partners, who work closely with Google Public Sector to build and deliver technology to U.S. public sector institutions.

      Zero-trust security and certifications at its core

      With Google Public Sector, our plan is to continue down the path of achieving the highest levels of U.S. government certifications and requirements possible. This means the division will have the capability to manage sensitive government data, and we are committed to protecting this data through our secure, zero-trust-based infrastructure. 

      Earlier this year, we announced that Google Workspace had achieved FedRAMP High authorization, a major milestone in giving the U.S. government more choice among productivity software offerings. Today, Google Cloud has more than 100 products authorized at FedRAMP High or FedRAMP Moderate. This includes products for collaboration, security, document translation, and many other use cases. We also have Impact Level 4 (IL4) certification across numerous products, which allows U.S. government agencies to store and process controlled unclassified information across our key cloud services. 

      In addition, we continue to support the federal government through enhanced security solutions like Assured Workloads, which enables agencies to confidently secure and configure sensitive workloads in support of compliance requirements; BeyondCorp, a zero-trust solution that enables secure access to applications and cloud resources with integrated threat and data protection; Chronicle, a cloud-based threat detection and response solution; and unique dedicated teams such as the Google Cybersecurity Action Team, which has the mission of supporting the security and digital transformation of governments, critical infrastructure, enterprises, and small businesses. 

      We believe Google Public Sector can and will play a critical role in applying cloud technology to solve complex problems for our nation—across U.S. federal, state, and local governments, and educational institutions. The government has asked for more choice in cloud vendors who can support its missions, and protect the health, safety, and security of its citizens. We are proud to have served the U.S. public sector for many years, and the launch of Google Public Sector will help us rapidly expand our services to the government, now and into the future.

    • Introducing new Cloud Armor features including rate limiting, adaptive protection, and bot defense Tue, 28 Jun 2022 16:00:00 -0000

      As cyberattacks grow in complexity and intensity against cloud customers, they need their cloud providers to play an even more active role in the resiliency of their web applications and APIs. Attacks have evolved from isolated DDoS attempts to far more comprehensive and coordinated techniques, including volumetric flood DDoS attacks, bot attacks, and API abuse. 

Google Cloud Armor can help our customers counter these growing security threats to web applications and services by empowering defenders to deploy a defense-in-depth strategy.

Today, we are proud to announce the General Availability of new capabilities in Cloud Armor that can greatly improve the security, reliability, and availability of deployments, including:

      • Per-client rate limiting, with new throttle and rate-based ban rule actions

      • Bot management powered by reCAPTCHA Enterprise

      • Machine learning-based Adaptive Protection to help counter advanced Layer 7 attacks

Also today, we are announcing the availability of new Cloud Armor features in Preview, including:

      • Updated preconfigured WAF rules based on the OWASP ModSecurity Core Rule Set (CRS) v3.3

      These new capabilities help provide enterprise-ready DDoS protection and web application firewall (WAF) solutions at planet-scale for our customers’ workloads, be they located on-premises, in colocation, or in any public cloud.


"The risk of a malicious distributed denial of service attack is notoriously hard to mitigate for most organizations,” explained Chris Aitchison, CTO of Up, which has the highest-rated banking app in Australia (4.6 on Google Play). “Google handles a significant amount of the world's internet traffic, and Up is extremely comfortable relying on Google Cloud Armor to give us world-class protection in this space."

      Google Cloud Armor

      Cloud Armor is the DDoS mitigation service and WAF that leverages Google’s planet-scale infrastructure to help protect your websites and applications from volumetric, protocol-based, and application-level DDoS attacks. Cloud Armor can also filter Layer 7 network traffic to mitigate OWASP Top 10 risks, whether the apps are deployed on Google Cloud, in hybrid environments, or in a multi-cloud architecture. 

      Rate limiting 

Web applications are frequently targeted by high-volume L7 requests like HTTP floods, with the intent of making the target service unavailable, or by lower-volume abusive user behavior such as credential stuffing. To mitigate such attacks, customers often need to limit the rate of requests that their applications and services receive.

With Cloud Armor’s new rate limiting capability, customers can curtail traffic to backend resources based on request volume, and prevent unwelcome traffic from overconsuming resources or affecting service availability. Configuring these rules at the edge of the network enables Cloud Armor users to protect their applications from unpredictably noisy clients.

In addition to rate limiting web traffic, you can also rate limit at the connection level with Cloud Armor for TCP/SSL Proxy.

      Figure 1. Cloud Armor Per-Client Rate Limiting

Google Cloud Armor has two types of rate-based rules:

      • Throttle: You can enforce a maximum request or connection limit per client by throttling individual clients to a user-configured threshold, or enforce a maximum request count across all clients based on an HTTP request property.

      • Rate-based ban: You can rate-limit requests or connections that match a rule on a per-client basis and then temporarily ban those clients for a configured period of time if they exceed the user-configured threshold.

Customers use Cloud Armor rate limiting to help prevent abusive behavior on a per-client basis, such as brute-force login attempts against their sites. Rate limiting rules can also be applied more narrowly using the CEL-based custom rules language: for example, a customer can enforce stricter rate limits on traffic from countries where they don’t have any (or many) users, and use rate limiting to throttle attacks originating there.
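As a sketch of a per-client rate-based ban rule configured with gcloud (the policy name, priority, and thresholds below are illustrative placeholders):

```shell
# Ban clients, keyed by IP, that exceed 100 requests per minute; ban lasts 5 minutes
gcloud compute security-policies rules create 1000 \
    --security-policy=my-policy \
    --src-ip-ranges="*" \
    --action=rate-based-ban \
    --rate-limit-threshold-count=100 \
    --rate-limit-threshold-interval-sec=60 \
    --conform-action=allow \
    --exceed-action=deny-429 \
    --enforce-on-key=IP \
    --ban-duration-sec=300
```

A throttle rule is configured the same way with `--action=throttle` and no ban duration; conforming requests pass through while excess requests receive the exceed action.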

      Up’s security team created a monitoring and alerting system when they first deployed rate limiting protections, using Cloud Logging and Cloud Monitoring. "We use rate limiting with Cloud Armor in various parts of our platform and we have found it effective in identifying malicious clients and stopping them in their tracks," said Aitchison.

      Cloud Armor Bot Management with reCAPTCHA Enterprise 

Last fall, we announced the preview of Cloud Armor bot management with reCAPTCHA Enterprise. Now generally available, these new features in Cloud Armor can provide mitigation against bot attacks, credential stuffing, scraping, inventory hoarding attacks, and other types of fraudulent transactions for our customers. This capability is powered by reCAPTCHA Enterprise, with intelligence derived from the more than 5 million websites running reCAPTCHA.

      Figure 2. Cloud Armor Advanced Bot Management with reCAPTCHA Enterprise Integration

Our bot management solution integrates Cloud Armor and reCAPTCHA Enterprise to support several use cases. The first allows Cloud Armor to help enforce reCAPTCHA’s “frictionless” assessment, where end users do not have to identify images or decipher text before proceeding on a web page. This assessment automatically deciphers the reCAPTCHA Enterprise token, which can indicate whether a bot is detected or suspected. Security teams can then create Cloud Armor rules to block access, or redirect suspected bots to alternate content, based on the reCAPTCHA risk score.

      The second use case is a more traditional manual challenge, where Cloud Armor serves a reCAPTCHA challenge to users when they trigger a WAF rule. In this case, the end user must pass the challenge for their request to be allowed through to the target application. This method allows for a manual user challenge before access is blocked, and gives Cloud Armor users the flexibility to create their own reCAPTCHA WAF site key and train a security model specific to that key.

      Finally, users can combine both of the above use cases, the frictionless assessment and the manual challenge, into a redemption flow. Cloud Armor policies can be configured to serve the manual challenge only to users who receive a high risk score from the frictionless assessment. This combination gives end users the opportunity to redeem themselves by passing the manual challenge and getting through to the target application.
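      As a sketch of that combined flow (the policy name, rule priority, and score threshold are placeholders, and a reCAPTCHA session-token site key is assumed to be configured already), a rule might redirect low-scoring sessions to a manual challenge:

```shell
# Hypothetical example: users whose frictionless reCAPTCHA session score is
# below 0.5 are redirected to a manual reCAPTCHA challenge; everyone else
# passes through untouched. "my-policy" and the threshold are placeholders.
gcloud compute security-policies rules create 400 \
    --security-policy=my-policy \
    --expression="token.recaptcha_session.score < 0.5" \
    --action=redirect \
    --redirect-type=google-recaptcha
```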

      Updated Preconfigured WAF Rules can help mitigate OWASP Top 10

      Cloud Armor now offers in preview new preconfigured WAF rules based on the OWASP ModSecurity Core Rule Set (CRS) v3.3, in addition to the existing v3.0 rule sets, to help our customers mitigate the OWASP Top 10 vulnerabilities. With this preview release, customers can deploy the latest industry-standard WAF signatures in Cloud Armor security policies to selectively filter Layer 7 traffic and help protect their web apps and services from exploit attempts such as SQL injection (SQLi), cross-site scripting (XSS), and remote code execution (RCE). This predefined set of WAF rules lets customers set a baseline level of protection for their public-facing endpoints. From there, they can use custom-defined WAF rules more specific to their web applications to further harden access and weed out unwanted connections.
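      For example (a sketch only; the policy name and priority are placeholders), a rule applying the v3.3 SQL injection signatures at the lowest sensitivity level might be created as follows:

```shell
# Hypothetical example: block requests matching the OWASP CRS v3.3 SQLi
# signatures. "my-policy" is a placeholder; the sensitivity level trades
# signature coverage against false positives.
gcloud compute security-policies rules create 500 \
    --security-policy=my-policy \
    --expression="evaluatePreconfiguredWaf('sqli-v33-stable', {'sensitivity': 1})" \
    --action=deny-403
```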

      One of Cloud Armor’s large enterprise customers has multiple platform teams, one for each major division. The team using Cloud Armor shared that its WAF defenses catch new threats and attacks before other security teams that aren’t using Cloud Armor do, and sometimes even before indicators of compromise appear in their threat intelligence feeds.

      Google Cloud Threat Intelligence for Cloud Armor now in Preview

      Another step we are taking to empower a defense-in-depth strategy against common web application security threats is the preview release of Google Cloud Threat Intelligence for Cloud Armor. Many of our customers told us that managing threat intelligence is becoming increasingly challenging as the threat landscape evolves, which is why Cloud Armor now provides ready-to-use, continuously updated threat intelligence to help our customers enhance their network security.

      With the new Google-curated Threat Intelligence, customers can configure security policies to filter traffic based on the following four categories: Tor exit nodes, malicious IPs, bad bots, and public cloud endpoints, with more categories planned in upcoming releases.
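      As a sketch (the policy name and priority are placeholders), a rule that denies traffic from Tor exit nodes using the curated feed could look like:

```shell
# Hypothetical example: deny requests arriving from known Tor exit nodes,
# one of the four Google-curated categories. "my-policy" is a placeholder.
gcloud compute security-policies rules create 600 \
    --security-policy=my-policy \
    --expression="evaluateThreatIntelligence('iplist-tor-exit-nodes')" \
    --action=deny-403
```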

      Figure 3. Cloud Armor Network Threat Intelligence Categories

      Cloud Armor Adaptive Protection

      Generally available since December 2021, Cloud Armor Adaptive Protection uses Google’s machine learning technology to give customers additional capability to detect and mitigate suspicious Layer 7 traffic to their applications and services in real time. You can learn more about it in this blog and this video.


      “Adaptive Protection alerts tell us exactly which one(s) of the dozens of attributes about an HTTP connection are out of bounds, and by how much relative to baseline traffic,” explained Fabio Coatti, Engineer at METRO Digital GmbH. "So far, the model has been very accurate, and our confidence in it continues to rise. This is key because mitigation can impact the experiences of legitimate customers. We want to ensure we act only on legitimate threats that might prevent our customers from using our portals.”

      Cloud Armor Helps Mitigate a Wide Array of Threats

      With all of the newly added capabilities, Cloud Armor allows you to leverage the scale of Google’s global network while combining industry-leading tools and tactics to help mitigate a wide array of threats to your internet-facing web applications and services.


      "We're extremely conservative about the traffic that we allow to cross our network boundaries. Cloud Armor allows us to enforce strict restrictions on traffic coming in and out of our platform with a simplicity that we didn't previously realize was possible," said Up’s Aitchison. “When we need to implement international sanctions regarding access to our network, Google Cloud Armor makes this extremely simple for us."

      In addition to advancing security capabilities, Cloud Armor recently expanded its workload coverage to CDN, Cloud Storage, and additional load balancers, as you can read in this blog.

      To learn more, explore the following resources:

      Get better cost predictability and volume pricing, advanced features, and two additional support services with Cloud Armor Managed Protection Plus

      Related Article

      Announcing general availability of Cloud Armor’s new edge security policies, and support for proxy load balancers

      Google Cloud expands its scope of DDoS and web application firewall protection with new edge security policies and proxy load balancers.

      Read Article
    • Announcing general availability of Cloud Armor’s new edge security policies, and support for proxy load balancers Tue, 28 Jun 2022 16:00:00 -0000

      Whether workloads are deployed in public clouds, on-premises, or other infrastructure providers, DDoS and Layer 7 attacks target all web applications, APIs, and services. That’s why Google Cloud continues to expand our scope of DDoS and web application firewall (WAF) protection for web applications, APIs, and services with Google Cloud Armor, so customers can defend internet-facing workloads no matter which architecture they choose.

      Today, we are announcing two major capabilities that expand Cloud Armor’s coverage to more types of workloads. First, we are launching edge security policies to help defend workloads using Cloud CDN, Media CDN, and Cloud Storage and filter requests before they are served from cache. Second, Cloud Armor now supports the TCP Proxy and SSL Proxy Load Balancers, to help block malicious traffic attempting to reach backends behind these load balancers. 

      Many of our customers already protect mission-critical, internet-facing services with Cloud Armor. Cloud Armor provides advanced protection of web applications, services, and APIs against DDoS, Layer 7 attacks, and fraud from bots, at planet-scale, for hybrid, multi-cloud architectures.

      Extending Support for Hybrid, Multi-cloud, and High Performance Deployments

      Web teams want to accelerate deployments and optimize content delivery through the use of increasingly complex applications, and do so across a variety of load balancers and caches in hybrid, multi-cloud environments.

      Google Cloud has been steadily expanding its coverage for various types of workloads, adding the ability to help protect and provide services to more and more workloads in a broader set of locations, including those on Google Cloud, on-premises, in colocation facilities, and in multiple public clouds. Cloud Armor can help customers deploy a consistent, secure edge regardless of where their workloads ultimately reside.

      Two years ago, Google Cloud introduced support for Internet Network Endpoint Groups (NEGs) and Hybrid NEGs, giving customers the ability to route traffic from Google Cloud’s edge to their workloads, either over the public internet, through private interconnect, or VPN tunnels. This update transformed Google Cloud Load Balancer and Cloud Armor into cloud services able to front and protect workloads wherever they may reside.

      In addition, for performance improvement and scale, customers are adopting a variety of Google Cloud services to accelerate and optimize application and content delivery, including Cloud CDN, Media CDN, and Cloud Storage.

      To further help our customers take on the challenge of protecting web applications and services across these diverse edge points, Cloud Armor now provides expanded coverage in two key areas: 

      • Edge Security Policies for Cloud CDN, Media CDN, and Cloud Storage

      • Support for both the TCP Proxy and SSL Proxy Load Balancers

      Edge Security Policies Filter Requests Before Serving From Cache

      Cloud Armor edge security policies allow customers to filter traffic before it is served from Cloud CDN and the newly released Media CDN caches, as well as from Cloud Storage backend buckets. Customers can also enforce geography-based access controls and security policies at the edge of the Google network, upstream of caches.
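      A sketch of what that setup could look like with gcloud (the policy name, bucket name, and region code are placeholders, not values from the article):

```shell
# Hypothetical example: create an edge security policy, add a geography-based
# deny rule, and attach the policy to a backend bucket. All names and the
# region code "XX" are placeholders.
gcloud compute security-policies create my-edge-policy \
    --type=CLOUD_ARMOR_EDGE \
    --description "filter cache-bound requests at the edge"

gcloud compute security-policies rules create 100 \
    --security-policy=my-edge-policy \
    --expression="origin.region_code == 'XX'" \
    --action=deny-403

gcloud compute backend-buckets update my-bucket \
    --edge-security-policy=my-edge-policy
```

Because the policy type is CLOUD_ARMOR_EDGE, the rule is evaluated before the cache, so even cache hits from the bucket are subject to the geography filter.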

      Figure 1. The application of Cloud Armor Edge Security Policy and Backend Security Policy

      Edge security policies and backend security policies can coexist on backend services fronted by the HTTP/S Load Balancer. When present, edge security policies are evaluated first, so they can filter requests whether or not those requests would result in a cache hit and be served from cache. Backend security policies evaluate Cloud CDN cache misses as well as requests for dynamic content that will be served from backend services, as seen above in Figure 1. For backend buckets, only edge security policies can be applied; traffic permitted through the policy is then forwarded by the load balancer to the destination bucket(s).

      Logs from edge security policies can capture evidence that controls such as IP filtering and enforcement of location-based restrictions are in place to facilitate compliance audits and inspection of CDN and bucket-bound connections.

      Here is an example using Google Cloud Console to create an Edge Security Policy:

      Figure 2. Configuring Edge Security Policy in Cloud Armor

      Cloud Armor for TCP Proxy and SSL Proxy Load Balancers Helps Block Malicious Traffic

      In addition to the Google Cloud External HTTP/S Load Balancer, Google Cloud offers the SSL Proxy Load Balancer for encrypted (SSL and TLS) non-HTTP traffic, and the TCP Proxy Load Balancer. Customers can use these proxy endpoints to leverage Google’s global load-balancing infrastructure to serve TCP-based workloads that don’t use the HTTP protocol. Another use case addressed here is for customers who prefer to handle TLS/SSL offload downstream, or who require mTLS support.

      Cloud Armor support for both of these load balancers is now available. These capabilities allow customers to leverage the network edge to block and/or throttle potentially malicious traffic based on IP address and end-user geolocation. Customers can also enable per-connection rate limiting to ensure no individual end user sends more than the desired amount of traffic. Read more about our newly released rate-limiting capability in this blog.

      Using the gcloud command line interface (CLI), customers can create security policies, enforce filters, and configure rate limits on new connection requests. Policies can be reused across HTTP/S load balancers and TCP/SSL proxy load balancers, leveraging fields like IP address, geolocation, and rate limiting rules to consolidate and simplify configuration and operations. To configure Cloud Armor to filter traffic headed to TCP and SSL proxies, policies can be attached to a TCP/SSL Proxy Load Balancer-protected backend service. Connection logs, including Cloud Armor decisions, can be sent to Cloud Logging by the TCP/SSL Proxy Load Balancers; this logging can be enabled via the CLI.

      You can create and enable your policy for TCP/SSL Proxy Load Balancers using the following gcloud CLI commands:

      1. Create a Cloud Armor security policy “example-one”:

      gcloud compute security-policies create example-one \
          --description "policy for tcp proxy rate limiting"

      2. Add a rate limiting rule to Cloud Armor security policy “example-one”:

      gcloud compute security-policies rules create 100 \
          --security-policy=example-one \
          --expression="true" \
          --action=rate-based-ban \
          --rate-limit-threshold-count=50 \
          --rate-limit-threshold-interval-sec=60 \
          --ban-duration-sec=300 \
          --conform-action=allow \
          --exceed-action=deny-404 \
          --enforce-on-key=IP

      3. Attach policy to TCP/SSL Proxy Load Balancer backend service “my-service-two”:

      gcloud compute backend-services update my-service-two \
          --security-policy example-one

      4. Enable logging on TCP/SSL Proxy Load Balancer backend service “my-service-two”:

      gcloud beta compute backend-services update my-service-two \
          --enable-logging \
          --logging-sample-rate=1

      Edge Protection for More of Your Environments

      With Cloud Armor, your organization can benefit from DDoS protection and WAF. Cloud Armor now helps detect and mitigate attacks against both cache points and backend service workloads, including those load-balanced by External HTTP/S Load Balancer, as well as the TCP and SSL Proxy Load Balancers. And these workloads can run anywhere: on-prem, in colocation data centers, in Google Cloud, and on other cloud platforms.

      With Cloud Armor you also get a machine learning mechanism to identify and block Layer 7 DDoS attacks, the ability to mitigate OWASP Top 10 risks with predefined WAF rules, and bot management, so you can stop fraud at the edge. To learn more about Cloud Armor, visit our website.

      Cloud Armor is also announcing a set of features that can improve network security. To learn more, explore the following resources:

      Get better cost predictability and volume pricing, advanced features, and two additional support services with Cloud Armor Managed Protection Plus.

      Related Article

      Introducing new Cloud Armor features including rate limiting, adaptive protection, and bot defense

      Cloud Armor strengthens its already formidable defenses with new features to counter advanced L7 attacks and block malicious bots.

      Read Article
    • Billing info is at your fingertips in the latest Cloud Console mobile app Tue, 28 Jun 2022 16:00:00 -0000

      Cloud billing is an important part of managing your cloud resources, and understanding your cloud spend estimates or accessing invoices is critical for many businesses. Thus far, the best way to check your billing information has been to use the Google Cloud Console from your favorite web browser. But Google Cloud users tell us that they want to be able to access billing data on the go.

      Today, we’re introducing a new way of accessing billing information: from the Cloud Console mobile app. Now, with your Android or iOS mobile device, you can access not only your resources (App Engine, Compute, Databases, Storage, or IAM), logs, incidents, and errors, but also your billing information. With these enhanced billing features, we are making it easier for you to understand your cloud spend.

      Billing in the Cloud Console mobile app

      With the newest app release, you can add a billing widget on the home dashboard of the Cloud Console mobile app using the “plus” button on the home screen. Whenever you open the app you will see the current spend of the selected billing account. You can also switch the active project from the home screen.

      Figure: Billing in the Cloud Console mobile app

      If you go from the main screen to the Billing tab, you can see your cost forecast. We also added sections to make navigation easier, and the Overview screen lets you see graphs of monthly trends and costs per project or per cloud service.

      The Budgets screen, meanwhile, lets you preview how each of your predefined budgets are being spent. You can learn more about Cloud Billing budgets in this blog post.

      The new Credits page shows you the usage of all the credits ever applied to your account, such as the one from the Climate Innovation Challenge, and the Account management page shows you details about your billing account. You can also check the account’s ID, which users manage the billing account, and which projects use the active account.

      And if you ever need help, you can always reach out to us directly from the app using the “Help with billing” section on the Billing tab!

      To summarize, we’ve enhanced billing on your smartphone with:

      • Smoother navigation
      • Forecasted cost
      • Access to billing graphs

      These new features are available for you to use today. If you have any feedback, we want to hear from you — just click the “Send feedback” button in the app. And don't forget to pin the dashboard card to the main screen, so you always have your billing information at your fingertips. Go ahead and download the app today from Google Play or the Apple App Store.

      Related Article

      Protect your Google Cloud spending with budgets

      Budgets are the first and simplest way to get a handle on your cloud spend. In this post, we break down a budget and help you set up aler...

      Read Article
    • Now in preview, BigQuery BI Engine Preferred Tables Tue, 28 Jun 2022 16:00:00 -0000

      Earlier this quarter, we announced that BigQuery BI Engine support for all BI and custom applications was generally available. Today we are excited to announce the preview launch of Preferred Tables support in BigQuery BI Engine! BI Engine is an in-memory analysis service that helps customers get low-latency performance for their queries across all BI tools that connect to BigQuery. With support for preferred tables, BigQuery customers can now prioritize specific tables for acceleration, achieving predictable performance and optimized use of their BI Engine resources.

      BigQuery BI Engine is designed to help customers get the freshest insights without sacrificing query performance, by accelerating their most popular dashboards and reports. It provides intelligent scaling and ease of configuration: customers do not have to worry about any changes to their BI tools or to the way they interact with BigQuery; they simply create a project-level memory reservation. BigQuery BI Engine’s smart caching algorithm ensures that data that tends to get queried often is in memory for faster response times. BI Engine also creates replicas of the data being queried to support concurrent access; this is based on query patterns and does not require manual tuning by the administrator.

      However, some workloads are more latency-sensitive than others, so customers want more control over which tables are accelerated within a project, to ensure reliable performance and better utilization of their BI Engine reservations. Before this feature, BigQuery BI Engine customers could achieve this only by using separate projects for the tables that need acceleration. That requires additional configuration, however, and acceleration alone is not a good reason to split workloads across projects.

      With the launch of preferred tables in BI Engine, you can now tell BI Engine which tables should be accelerated. For example, suppose two types of tables are queried from your project: a set of pre-aggregated or dimension tables queried by dashboards for executive reporting, and the tables used for ad hoc analysis. You can now ensure that your reporting dashboards get predictable performance by configuring the first set as preferred tables in the BigQuery project. That way, other workloads from the same project will not consume the memory required for interactive use cases.

      Getting started

      To use preferred tables, you can use the Cloud Console, the BigQuery Reservation API, or a data definition language (DDL) statement in SQL. We will show the UI experience below. You can find detailed documentation of the preview feature here.

      Simply edit the existing BI Engine configuration in the project. You will see an optional step for specifying preferred tables, followed by a box in which to list the tables you want to set as preferred.


      The next step is to confirm and submit the configuration and you will be ready to go! 

      Alternatively, you can also achieve this by issuing a DDL statement in SQL editor as follows:

      ALTER BI_CAPACITY `<PROJECT_ID>.region-<REGION>.default`
      SET OPTIONS(
        size_gb = 100,
        preferred_tables = ["bienginedemo.faadata.faadata1"]);

      This feature is available in all regions today and rolled out to all BigQuery customers. Please give it a spin!

      Related Article

      Learn how BI Engine enhances BigQuery query performance

      This blog explains how BI Engine enhances BigQuery query performance, different modes in BI engine and its monitoring.

      Read Article
    • Incorporating quota regression detection into your release pipeline Tue, 28 Jun 2022 16:00:00 -0000

      On Google Cloud, one of the ways an organization may want to enforce fairness in how much of a resource can be consumed is through the use of quotas. Limiting resource consumption on services is one way that companies can better manage their cloud costs. Oftentimes, people associate quotas with the APIs used to access a given resource. Although an endpoint may be able to handle a high number of queries per second (QPS), a quota ensures that no one user or customer has a monopoly on the available capacity. This is where fairness comes into play: quotas let people set limits scoped per user or per customer, and raise or lower those limits as needed.

      Although quota limits address the issue of fairness from a resource providers’ point of view — in this case, Google Cloud — you still need a way as the resource consumer to ensure that those limits are adhered to and, just as importantly, ensure that you don’t inadvertently violate those limits. This is especially important in a continuous integration and continuous delivery (CI/CD) environment, where there is so much automation going on. CI/CD is heavily based on automating product releases and you want to ensure that the products released are always stable. This brings us to the issue of quota regression.

      What is quota regression and how can it occur? 

      Quota regression refers to the unplanned change in an allocated quota that oftentimes results in a reduced capacity for resource consumption. 

      Let's take for example an accounting firm. I have many friends in this sector and they can never hang out with me during their busy season between January and April. At least, that’s the excuse. During the busy season, they have an extraordinarily high caseload, and a low caseload the rest of the year. Let’s assume that these caseloads have an immediate impact on the firm’s resource costs on Google Cloud. Since the high caseload only occurs at a particular point in the year, it may not be necessary to maintain a high quota at all times; it’s not financially prudent, since resources are paid for on a per-usage model.

      If the accounting firm has an in-house engineering team that has built load tests to ensure the system functions as intended, you would expect the load capacity to increase before the busy season. If the load testing is done in an environment separate from the serving one (which it should be, for reasons such as security and avoiding unnecessary access grants to data), this is where you might start to see a quota regression. An example of this is load testing in your non-prod Google Cloud project (e.g., your-project-name-nonprod) and promoting images to your serving project (e.g., your-project-name-prod).

      For the load tests to pass, sufficient quota must be allocated to the load-testing environment. However, that quota may not have been granted in the serving environment. It could be a simple oversight, where an admin needed to request the additional quota in the serving environment, or the quota may have been reverted after a busy season and the change went unnoticed. Whatever the reason, it depends on human intervention to assert that quotas are consistent across environments. If this is missed, the firm can go into a busy season with passing load tests and still have a system outage due to a lack of quota in the serving environment.

      Why not just use traditional monitoring?

      This brings to mind the argument of “security monitor vs. security guard.” Even with monitoring to detect such inconsistencies, alerts can be ignored, and alerts can be late. Alerts work if there is no automation tied to the behavior, and in the example above they may well suffice. In the context of CI/CD, however, a deployment that introduces higher QPS on its dependencies is likely to be promoted from a lower environment to the serving environment, because the load tests pass as long as the lower environment has sufficient quota. The problem is that the deployment is then automatically pushed to production, with alerts probably arriving only along with the outage.

      The best way to handle these scenarios is to incorporate an automated way of not just monitoring and alerting, but a means for preventing promotion of that regressive behavior to the serving environment. The last thing you want is new logic that requires a higher resource quota than what is granted being automatically promoted to prod.

      Why not use existing checks in tests? The software engineering discipline offers several types of tests (unit, integration, performance, load, smoke, etc.), none of which address something as complex as cross-environment consistency. Most of them focus on the user and expected behaviors. The only test that really focuses on infrastructure is the load test, but a quota regression is not necessarily something a load test will detect, since a load test occurs in its own environment and is agnostic of where it is actually running.

      In other words, a quota regression test needs to be aware of the environments — it needs an expected baseline environment where the load test occurs and an actual serving environment where the product will be deployed. What I am proposing is an environment aware test to be included in the suite of many other tests.

      Quota regression testing on Google Cloud

      Google Cloud already provides services that you can use to easily incorporate this feature. This is more of a systems architecture practice that you can exercise. 

      The Service Consumer Management API provides the tools you need to create your own quota regression test. Take for example the ConsumerQuotaLimit resource that’s returned via the list API. For the remainder of this discussion, let’s assume an environment setup such as this:

      Diagram demonstrating an extremely simple deployment pipeline for a resource provider.

      In the diagram above, we have a simplified deployment pipeline:

      1. Developers submit code to some repository

      2. The Cloud Build build and deployment trigger gets fired

        1. Tests are run

        2. Deployment images are pushed if the prerequisite steps succeed

      3. Images are pushed to their respective environments (in this case build to dev, and previous dev to prod)

      4. Quotas are defined for the endpoints on deployment

      5. Cloud Load Balancer makes the endpoints available to end users

      Quota limits

      With this mental model, let’s home in on the role quotas play in the big picture. Let’s assume we have the following service definition for an endpoint called “FooService”. The service name, metric label, and quota limit value are what we care about for this example.

      gRPC Cloud Endpoints YAML Example

      type: google.api.Service
      config_version: 3
      name: fooservice.endpoints.my-project-id.cloud.goog
      title: Foo Service gRPC Cloud Endpoints
      apis:
        - name: com.foos.demo.proto.v1.FooService
      usage:
        rules:
          # ListFoos methods can be called without an API Key.
          - selector: com.foos.demo.proto.v1.FooService.ListFoos
            allow_unregistered_calls: true
          # GetFoo methods can be called without an API Key.
          - selector: com.foos.demo.proto.v1.FooService.GetFoo
            allow_unregistered_calls: true
          # UpdateFoo methods can be called without an API Key.
          - selector: com.foos.demo.proto.v1.FooService.UpdateFoo
            allow_unregistered_calls: true
      metrics:
        - name: library.googleapis.com/read_calls
          display_name: "Read Quota"
          value_type: INT64
          metric_kind: DELTA
        - name: library.googleapis.com/write_calls
          display_name: "Write Quota"
          value_type: INT64
          metric_kind: DELTA
      quota:
        limits:
          - name: "apiReadQpmPerProject"
            metric: library.googleapis.com/read_calls
            unit: "1/min/{project}"
            values:
              STANDARD: 1
          - name: "apiWriteQpmPerProject"
            metric: library.googleapis.com/write_calls
            unit: "1/min/{project}"
            values:
              STANDARD: 1
        # By default, all calls are measured with a cost of 1:1 for QPM.
        # See https://github.com/googleapis/googleapis/blob/master/google/api/quota.proto
        metric_rules:
          - selector: "*"
            metric_costs:
              library.googleapis.com/read_calls: 1
          - selector: com.foos.demo.proto.v1.FooService.UpdateFoo
            metric_costs:
              library.googleapis.com/write_calls: 2

      In our definition we’ve established:

      • Service Name: fooservice.endpoints.my-project-id.cloud.goog

      • Metric Label: library.googleapis.com/read_calls

      • Quota Limit: 1

      With these elements defined, we’ve now restricted read calls to exactly one per minute for the service. Given a project number (e.g., 123456789), we can now issue a call to the Consumer Quota Metrics Service to display the service quota.

      Example commands and output.

      $ alias gcurl='curl -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json"'
      $ gcurl https://serviceconsumermanagement.googleapis.com/v1beta1/services/fooservice.endpoints.my-project-id.cloud.goog/projects/my-project-id/consumerQuotaMetrics

      Response example (truncated)

      {
        "metrics": [
          {
            "name": "services/fooservice.endpoints.my-project-id.cloud.goog/projects/123456789/consumerQuotaMetrics/library.googleapis.com%2Fread_calls",
            "displayName": "Read Quota",
            "consumerQuotaLimits": [
              {
                "name": "services/fooservice.endpoints.my-project-id.cloud.goog/projects/123456789/consumerQuotaMetrics/library.googleapis.com%2Fread_calls/limits/%2Fmin%2Fproject",
                "unit": "1/min/{project}",
                "metric": "library.googleapis.com/read_calls",
                "quotaBuckets": [
                  {
                    "effectiveLimit": "1",
                    "defaultLimit": "1"
                  }
                ]
              }
            ],
            "metric": "library.googleapis.com/read_calls"
          }
        …

      In the above response, the most important field to note is the effectiveLimit for a given service metric. The effective limit is the limit actually applied to a resource consumer when enforcing customer fairness, as discussed earlier.

      Now that we’ve established how to get the effectiveLimit for a quota definition on a resource per project, we can define the assertion of quota consistency as: 

      Load Test Environment Quota Effective Limit <= Serving Environment Quota Effective Limit 

      With a test like this in place, you can integrate it with something like Cloud Build to block the promotion of your image from the lower environment to your serving environment when the test fails. That prevents the new image from introducing regressive behavior into the serving environment that would otherwise result in an outage. 
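
      That consistency assertion is simple to express in code. Below is a minimal sketch, assuming responses shaped like the truncated consumerQuotaMetrics example above; the helper names are illustrative, not part of any Google API:

```python
# Sketch: assert quota consistency between two environments.
# Input dicts follow the shape of the consumerQuotaMetrics response
# shown earlier; fetching them (e.g., with the gcurl alias) is omitted.

def effective_limits(metrics_response):
    """Map each metric name to its integer effectiveLimit."""
    limits = {}
    for metric in metrics_response.get("metrics", []):
        for limit in metric.get("consumerQuotaLimits", []):
            for bucket in limit.get("quotaBuckets", []):
                if "effectiveLimit" in bucket:
                    limits[limit["metric"]] = int(bucket["effectiveLimit"])
    return limits

def quota_is_consistent(load_test_response, serving_response):
    """True iff every load-test effective limit is <= the serving limit."""
    serving = effective_limits(serving_response)
    return all(
        metric in serving and limit <= serving[metric]
        for metric, limit in effective_limits(load_test_response).items()
    )
```

      A Cloud Build step could call quota_is_consistent on the two environments’ responses and exit non-zero on False, blocking the image promotion.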

      The importance of early detection

      It’s not enough to alert on a detected quota regression and block the image promotion to prod; it’s better to raise alarms as soon as possible. If resources are lacking when it’s time to promote to production, you’re faced with wrangling enough resources in time, which may not be possible in the desired timeline: the resource provider may need to scale up its resources to handle the increase in quota, and that cannot always be done in a day. For example, if the service is hosted on Google Kubernetes Engine (GKE), even with autoscaling, what if the IP pool is exhausted? Cloud infrastructure changes, although elastic, are not instant. Production planning needs to account for the time needed to scale.

      In summary, quota regression testing is a key component of handling overload and load balancing in any cloud service, not just Google Cloud. It is important for product stability given the dips and spikes in demand that will inevitably appear. If you continue to rely on human intervention to ensure consistency of your quota across your configurations, you only guarantee that eventually you will have an outage when that consistency is not met. For more on working with quotas, check out the documentation.

      Related Article

      5 principles for cloud-native architecture—what it is and how to master it

      Learn to maximize your use of Google Cloud by adopting a cloud-native architecture.

      Read Article
    • Announcing MITRE ATT&CK mappings for Google Cloud security capabilities Tue, 28 Jun 2022 15:00:00 -0000

      The adoption of Autonomic Security Operations (ASO) requires the ability to use threat-informed decision making throughout the continuous detection and continuous response (CD/CR) workflow. We are excited to facilitate this process by mapping native security capabilities of Google Cloud to MITRE ATT&CK® through our research partnership with the MITRE Engenuity Center for Threat-Informed Defense.

      As a result, Google Cloud users can now evaluate the effectiveness of native security controls against specific ATT&CK® techniques. These mappings can increase your ability to develop better use cases and response playbooks, and identify how to improve security across your Google Cloud workloads. Application of the findings can enhance your ability to use our cloud native tools such as Chronicle, Siemplify, Security Command Center, and VirusTotal to defend your organization. The mappings include 49 Google Cloud security controls following a methodical scoring rubric.

      “Applying threat-informed defense is about using cyber threat intelligence to understand, prioritize, and improve our defensive capabilities. Mapping the native security controls of Google Cloud to MITRE ATT&CK® is a foundational step that empowers defenders with an independent assessment of how Google Cloud capabilities can defend against ATT&CK® techniques,” said Jon Baker, General Manager and Co-Founder, Center for Threat-Informed Defense.

      Mapping methodology

      The scoring methodology used is consistent with the Center’s previous work on similar mappings for other leading cloud providers’ security controls. The graphic below outlines the five main steps of the mapping methodology:

      1. Identify the security controls that would be used and ensure that they are native to the platform. 
      2. Conduct extensive research on the functionality of the control and how that could be applied in ATT&CK. 
      3. Using the analysis conducted in step 2, map the control to an ATT&CK technique that the control mitigates. 
      4. Score the technique based on effectiveness.
      5. Produce the mapping files.

       Scoring Rubric


      The scoring rubric is centered around the continuous cycle of Protect, Detect, and Respond. Each control has been applied to one of these functions and a level of coverage. You will notice a commonality between this rubric and the CD/CR workflow of ASO. This is because at the heart of ASO is the ability to bring Cyber Threat Intelligence (CTI) into decisions and provide measurements that can create a feedback loop of improvement. The scoring produced in this project can improve this process for your security operations team.

      Google Cloud security stack mappings

      This process mapped Google Cloud native security controls to ATT&CK techniques. The graphic below is the ATT&CK Navigator Layer that visualizes these mappings. 

      Each color represents one of the areas of the rubric and its corresponding level of coverage. The included legend indicates the specific controls. Note that the purple shading represents areas where overlap among rubrics was observed. The layer shown below only shows techniques, but the mapping also includes sub-techniques when expanded. 

      In addition to viewing the data as a Navigator layer, there are also YAML files that provide the complete data structure for each technique. The flow chart of this data structure is included next to the legend, and the source YAML data format can be found for each control.
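
      As a rough illustration of how per-technique data like this could be consumed, the snippet below assumes a simplified structure (technique ID plus a Protect/Detect/Respond category and a score); the published mapping files define the real schema, so treat these field names as placeholders:

```python
# Sketch: summarize one control's ATT&CK mappings by rubric category.
# The structure below is an assumed, simplified stand-in for the
# project's YAML data format, not its actual schema.
from collections import Counter

example_mapping = {
    "control": "Example Control",  # hypothetical control name
    "techniques": [
        {"id": "T1078", "category": "Protect", "score": "Partial"},
        {"id": "T1190", "category": "Detect", "score": "Minimal"},
        {"id": "T1566", "category": "Protect", "score": "Significant"},
    ],
}

def coverage_by_category(mapping):
    """Count mapped techniques per Protect/Detect/Respond category."""
    return Counter(t["category"] for t in mapping["techniques"])
```

      A summary like this is the kind of measurement that can feed the CD/CR feedback loop described above.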


      Next Steps

      The Google Cloud ATT&CK Mappings can be a key foundation for your application of ASO and can empower defenders to understand their impact on adversary behaviors and make threat-informed decisions. It is recommended that organizations take the time to assess each phase of the CD/CR pipeline, establish OKRs across core areas, and identify where they can improve the operationalization of ATT&CK mappings across their organizations. 

      We look forward to our continued investment in research initiatives to help democratize the journey towards Autonomic Security Operations. We will continue supporting community initiatives and cross-industry collaboration to help foster an improved state of security for the community at large. To learn more about the project and how you can get involved, read the MITRE Engenuity Center for Threat-Informed Defense release announcement.

      Related Article

      Security through collaboration: Building a more secure future with Confidential Computing

      Google Cloud, Project Zero, and AMD collaborated for several months to conduct a detailed review of the technology that powers Confidenti...

      Read Article
    • Google Cloud announces new products, partners and programs to accelerate sustainable transformations Tue, 28 Jun 2022 05:00:00 -0000

      At Google, we believe that the path to a sustainable future begins with the small decisions we make every day. But industries, governments and corporations are challenged to make these decisions without the right data or insights to inform them. Even a small choice for an organization — which raw material to choose for a new product, when to proactively water crops ahead of a drought, which green funds to invest in — requires understanding unique and often complex information. 

      Everyone wants to better understand how to become more sustainable, and take actions that have a meaningful impact. This year in the U.S., “how to reduce my carbon footprint” is being searched more than ever, and searches for “what is greenwashing” have increased five-fold over the past decade. Businesses and individuals alike are wondering how to turn sustainability ambition into action.


      At the Google Cloud Sustainability Summit, we’re excited to expand our sustainability solutions, and launch new datasets, tools and partnership programs that can help make the sustainable choice the easy choice, for everyone. 

      Providing climate insights for every organization

      Last week we announced two new climate insights offerings for the public sector to help institutions better understand the risks to infrastructure and natural resources due to climate change. These insights can help governments transform the way they manage physical and natural resources, helping them become more climate-resilient. Every industry is also experiencing a new era of sustainability-driven transformation. Like with any transformation, how, why and what you transform needs to be informed by accurate data about your current state, and insights into the potential impact of your decisions. To help deliver these insights to all our customers, we’re excited to share that Google Earth Engine on Google Cloud is now generally available. 

      Google Earth Engine, which originally launched to scientists and NGOs in 2010, is a leading technology for planetary-scale environmental monitoring. Google Earth Engine combines data from hundreds of satellites and other sources with geospatial cloud computing resources to show timely, accurate, high-resolution and decision-relevant insights about the state of the world’s habitats and ecosystems — and how they’re changing over time. With one of the largest publicly available data catalogs and a global data archive that goes back 50 years and updates every 15 minutes, it’s possible to detect trends and understand correlations between human activities and environmental impact more precisely than ever before.


      With Google BigQuery, Google Maps Platform and Earth Engine, Google provides a powerful combination of geospatial cloud products and solutions to serve customers’ location-aware analysis needs regardless of the scale, complexity or format of the data. This will enable customers like Regrow, a startup in the field of regenerative agriculture, to more easily contribute to our shared challenges around climate change and tackle their unique business challenges involving geospatial data. 

      “Regrow aims to make regenerative agriculture ubiquitous across the globe with an overall mission to mitigate climate change. That job has been made easier by Google Earth Engine, a platform which has allowed us to scale our technology and increase confidence in our data and reports,” said Juan Delard de Rigoulieres Mantelli, CTO, Regrow.

      Sharing carbon-free energy insights with customers

      When we set out to use 24/7 carbon-free energy across our global operations by 2030, we knew that we would need better tools to track energy consumption and production. After all, you can’t manage what you don’t measure, and existing approaches to clean energy tracking were not designed to track hour-by-hour energy use. For the past 10 years, and together with our partners, we’ve collected insights and knowledge about how to progress our business towards a carbon-free future. We’re excited to start sharing 24/7 carbon-free energy insights with our Google Cloud customers through a new pilot program.

      With access to historical and real-time data, and at regional and hourly granularity, customers will see a clear picture of their electricity emissions profile. The pilot will enable customers to baseline their existing carbon-free energy (CFE) score and their scope 2 carbon footprint, help them forecast and plan for an optimized energy portfolio, and eventually execute on carbon-free energy transactions. 

      Sharing knowledge like this will be key to helping everyone reach ambitious net-zero targets. For example, companies like Iron Mountain are joining the Carbon-free Energy Compact to accelerate decarbonization. 

      “In 2021 we adopted the same 24/7 carbon-free energy goal pioneered by Google, and we recognize that the key to making progress towards this is access to good data and the ability to share that data with solution providers,” said Chris Pennington, Director of Energy and Sustainability at Iron Mountain. “Our early steps towards 24/7 have been enabled by key partners, including Google, who are providing us with the insights we need to evaluate our current performance and identify the next steps on our 24/7 journey. We place a great deal of value in collaboration to achieve better results, faster.”

      Expanding the Carbon Sense suite

      In the latest launch of the Carbon Sense suite of products we’re adding new data, expanding reporting coverage and making it easier for cloud architects and administrators to prioritize sustainability. 

      Last year we announced Carbon Footprint for Google Cloud, which helps companies measure, report and reduce the gross carbon emissions of using Google Cloud services. We’re excited that early next year, we’ll launch Carbon Footprint for Google Workspace, providing similar reporting functionality for the emissions associated with products like Gmail, Meet, Docs and others. 

      For sustainability teams that need to access the data in Carbon Footprint for reporting purposes, we’re also excited to launch a dedicated Identity and Access Management (IAM) role for Carbon Footprint. This will enable non-technical users of Google Cloud to easily access the emissions data and use it for tracking or in disclosures. You don’t need to be a cloud computing expert to view and export carbon emissions data associated with your cloud usage. 

      Shopify’s sustainability and IT teams are closely aligned on their sustainability goals. “Shopify is on a mission to be the lowest carbon commerce platform for millions of entrepreneurs around the world,” says Stacy Kauk, Head of Sustainability at Shopify. “Tools like Carbon Footprint allow our engineers to understand the carbon impact of our technology decisions, and ensure we continue to architect a fast, resilient and low-carbon commerce solution." 

      You also don’t need to be a sustainability expert to make sustainable computing choices. For IT teams, and the administrators and cloud architects within them, we’re introducing low-carbon mode, which enables you to restrict your cloud resources to the low-carbon locations across our infrastructure using new low-carbon locations value groups. One of the most impactful actions you can take to reduce the gross emissions of using Google Cloud is to prioritize the locations with more carbon-free energy powering our infrastructure. Relative to other choices, you may be able to lower carbon emissions by 5-10x.


      One company that is putting emissions data in the hands of engineers is Uber. “At Uber we take sustainability seriously across the organization,” said Michael Sudakovich, Sustainable Engineering Lead and Senior Security Engineer at Uber. “From giving riders more sustainable choices to now giving our engineers data about their services' cloud emissions and recommendations on emission reduction, with Carbon Footprint. Helping everyone make more sustainable choices is a priority for all of our teams as we work to make Uber a zero-emission platform in Canada, Europe, and the US by 2030, and worldwide by 2040.”

      Finally, Carbon Footprint is adding both Scope 1 and Scope 3 emissions to its reporting data. These are the apportioned amounts of Google’s Scope 1 and Scope 3 emissions associated with a customer’s use of Google Cloud. You can read a detailed explanation of the different scopes of emissions here, but for a quick breakdown: Scope 1 emissions come from sources an organization controls directly; Scope 2 emissions are associated with the production of energy the organization uses (those were already in Carbon Footprint); and Scope 3 emissions are indirect emissions from up and down the value chain. Users will soon have a comprehensive view of the emissions associated with their Google Cloud usage. 
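
      To make the breakdown concrete, a comprehensive gross footprint is simply the sum of the three scopes. The field names below are illustrative, not the actual Carbon Footprint export schema:

```python
# Sketch: gross footprint = Scope 1 + Scope 2 + Scope 3.
# Field names and values are hypothetical, in kgCO2e.

def total_emissions(row):
    """Sum the three emissions scopes for one reporting row."""
    return row["scope1_kgco2e"] + row["scope2_kgco2e"] + row["scope3_kgco2e"]

row = {"scope1_kgco2e": 2, "scope2_kgco2e": 40, "scope3_kgco2e": 8}
total_emissions(row)  # -> 50
```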

      “At SAP, sustainability is core to our culture and operations and we ensure it is infused across the organization. Our SAP Cloud deployment strategy focuses on sustainable data centers to help achieve our commitment to net-zero by 2030. We are leveraging Carbon Footprint to understand, report, and reduce our gross carbon emissions associated with our Google Cloud usage. Google data centers help SAP, and our joint customers, make their entire value chains more sustainable,” said Tom Lee, Head of Multicloud Products and Services, SAP. 

      Growing our sustainability ecosystem

      The ecosystem of Google Cloud partners focused on sustainability continues to expand at a remarkable pace. The initiative, which brings technology providers together to help global businesses and governments accelerate sustainability programs, has added multiple new partners with innovative solutions. Today, we’re announcing two new programs to make it easier for partners to participate in the initiative, and for organizations to find the tools and expertise to help achieve their sustainability goals.

      First, Google Cloud Ready - Sustainability is a new validation program for partners with a business-ready solution available on Google Cloud that helps customers achieve sustainability goals. Partners with the GCR-Sustainability designation deliver solutions that reduce carbon emissions, increase the sustainability of value chains, help organizations process ESG data or help them identify climate risks for increased resilience. 

      Carto, Climate Engine, NGIS, GEOTAB, Planet, Atlas AI, and Electricity Map have already achieved their Google Cloud Ready - Sustainability designation. Many of these partners have expertise in next-generation technologies addressing ESG challenges such as geospatial or climate data and analytics. Providers like Dun & Bradstreet are excited about this new sustainability validation program.

      "As climate-related events increase in magnitude and frequency, it's imperative that we incorporate climate data into business risk management across company locations and supply chains. Programs like Google Cloud Ready for Sustainability accelerate access to solutions that can drive ESG transformations, such as applying climate-based risk factors alongside traditional financial considerations," said Rochelle March, Head of ESG Product at Dun & Bradstreet. 

      Cloud Ready for Sustainability is part of Google Cloud Partner Advantage, designed to maximize our partners' success across business models, customer requirements, success metrics, and strategic priorities. You can learn more about Google Cloud Ready for Sustainability and complete an application here.

      Second, we’re launching the Google Cloud Marketplace Sustainability Hub, providing customers with easy access to validated sustainability solutions. The Marketplace Sustainability Hub will showcase Google Cloud Ready for Sustainability solutions, which can be purchased directly from the site. Look for the Marketplace Sustainability Hub to launch soon.

      Don’t miss all the exciting content at the Sustainability Summit

      Tomorrow, June 28, we’re bringing technologists, developers, business and sustainability leaders together to learn how the climate leaders of today are building for the future. You can catch all the talks, films, presentations and demos here, so don’t miss out!

      Related Article

      Announcing new tools to measure—and reduce—your environmental impact

      Now you can evaluate and reduce the carbon footprint of your cloud workloads, and evaluate your environmental impact with Earth Engine.

      Read Article
    • Enterprise Data Integration with Data Fusion Mon, 27 Jun 2022 22:00:00 -0000

      A challenge with data analytics is that data is stored in many places and in different formats. As a result, you often need to complete numerous integration activities before you can start to gain insights from your data. Data Fusion offers a one-stop shop for all enterprise data integration activities, including ingestion, ETL, ELT, and streaming, with an execution engine optimized for SLAs and cost. It is designed to make life easier for ETL developers, data analysts, and data engineers in Google Cloud, hybrid cloud, or multicloud environments. 

      Data Fusion

      Data Fusion is Google’s cloud-native, fully managed, scalable enterprise data integration platform. It brings in transactional, social, or machine data in various formats from databases, applications, messaging systems, mainframes, files, SaaS, and IoT devices; offers an easy-to-use visual interface; and provides deployment capabilities to execute data pipelines on ephemeral or dedicated Dataproc clusters running Spark. Cloud Data Fusion is powered by open-source CDAP, which makes pipelines portable across Google Cloud, hybrid, or multicloud environments. 

      Data integration capabilities 

      Data integration for optimized analytics and accelerated data transformations

      • Data Fusion supports a broad set of more than 200 connectors and formats, which enables you to extract and blend data. You can develop data pipelines in a visual environment to help improve productivity. 

      • Data Fusion provides data wrangling capabilities to prepare data, and lets you operationalize that wrangling to help improve business-IT collaboration. 

      • You can leverage the extensive REST API to design, automate, orchestrate and manage the lifecycle of the pipelines.

      • Data Fusion supports all data delivery modes, including batch, streaming, and real-time, making it a comprehensive platform for both batch and streaming use cases.

      • Data Fusion provides operational insights so that you can monitor data integration processes, manage SLAs, and optimize and fine-tune integration jobs. 

      • Data Fusion provides capabilities to parse and enrich unstructured data using Cloud AI, for example, converting audio files to text, applying NLP to detect sentiment, or extracting features from images and documents or converting HL7 to FHIR formats.
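
      For instance, lifecycle operations such as starting a deployed pipeline go through the REST API. The sketch below builds a start URL following the open-source CDAP convention that underlies Data Fusion; treat the exact path as an assumption to verify against the API reference, and note that real calls also require an OAuth bearer token:

```python
# Sketch: build the REST URL for starting a deployed batch pipeline.
# The path follows the CDAP convention; verify against current docs
# before relying on it.

def pipeline_start_url(instance_endpoint, pipeline, namespace="default"):
    """URL to POST to in order to start a batch data pipeline run."""
    return (
        f"{instance_endpoint}/v3/namespaces/{namespace}"
        f"/apps/{pipeline}/workflows/DataPipelineWorkflow/start"
    )
```

      The same URL scheme, with different suffixes, covers stopping runs, checking status, and retrieving logs, which is what makes full lifecycle automation practical.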

      Data consistency

      Data Fusion helps to build confidence in business decision-making with advanced data consistency features: 

      • Data Fusion helps minimize the risk of mistakes by providing structured ways of specifying transformations, data quality checks with Wrangler, and predefined directives.

      • Data Fusion helps identify quality issues by keeping track of profiles of the data being integrated, enabling you to make decisions based on data observability.

      • Data formats change over time; Data Fusion helps handle data drift with the ability to identify change and customize error handling.

      Metadata and modeling

      Data Fusion can help you gain insights with metadata:

      • You can collect technical, business, and operational metadata for datasets and pipelines and easily discover metadata with a search.

      • Data Fusion provides an end-to-end data view to help you understand the data model, and to profile data, flows, and relationships of datasets.

      • Data Fusion enables exchange of metadata between catalogs and integration with end-user workbenches using REST APIs.

      The Data Fusion data lineage feature helps you to understand the flow of your data and how it is prepared for business decisions. 


      Open, hybrid, and multi-cloud

      Data Fusion is cloud-native and powered by CDAP, a 100% open-source framework for building on-premises and cloud data analytics applications. This means you can deploy and execute integration pipelines in different environments without any changes to suit business needs. 

      Data protection

      Data Fusion provides options for data security in the following ways:

      • It provides secure access to on-premises data with private IP.

      • It encrypts data at rest by default, or with customer-managed encryption keys (CMEK) for control over all user data in supported storage systems.

      • It provides data exfiltration protection via VPC Service Controls, a security perimeter around platform resources.

      • You can store sensitive passwords, URLs, and JDBC strings in Cloud KMS, and integrate with external KMS systems.

      • It integrates with Cloud DLP to mask, redact, and encrypt data in transit.


      Chances are that in your enterprise there is data siloed in various platforms. If it’s your job to bring it together, apply transformations, create data pipelines, and make all your data teams happier and more productive, then Cloud Data Fusion can help you achieve these goals. And if you already use Google Cloud data tools for curating a data lake with Cloud Storage and Dataproc, moving data into BigQuery for data warehousing, or transforming data for a relational store like Cloud Spanner, then Data Fusion integrations can help make development and iteration faster and easier. For a more in-depth look into Data Fusion, check out the documentation.

      For more #GCPSketchnote, follow the GitHub repo. For similar cloud content follow me on Twitter @pvergadia and keep an eye out on thecloudgirl.dev.

    Google has many products, including: Android Auto, Android OS, Android TV, Calendar, Cardboard, Chrome, Chrome Enterprise, Chromebook, Chromecast, Connected Home, Contacts, Digital Wellbeing, Docs, Drive, Earth, Finance, Forms, Gboard, Gmail, Google Alerts, Google Analytics, Google Arts & Culture, Google Assistant, Google Authenticator, Google Chat, Google Classroom, Google Duo, Google Expeditions, Google Family Link, Google Fi, Google Files, Google Find My Device, Google Fit, Google Flights, Google Fonts, Google Groups, Google Home App, Google Input Tools, Google Lens, Google Meet, Google One, Google Pay, Google Photos, Google Play, Google Play Books, Google Play Games, Google Play Pass, Google Play Protect, Google Podcasts, Google Shopping, Google Street View, Google TV, Google Tasks, Hangouts, Keep, Maps, Measure, Messages, News, PhotoScan, Pixel, Pixel Buds, Pixelbook, Scholar, Search, Sheets, Sites, Slides, Snapseed, Stadia, Tilt Brush, Translate, Travel, Trusted Contacts, Voice, Waze, Wear OS by Google, YouTube, YouTube Kids, YouTube Music, YouTube TV, and YouTube VR.

    Google News

    Think with Google

    Google AI BlogAndroid Developers BlogGoogle Developers Blog



    ZDNet » Google

    9to5Google » Google

    Computerworld » Google

    • FCC commissioner wants Apple, Google to remove TikTok from App Stores Wed, 29 Jun 2022 05:34:00 -0700

      FCC Commissioner Brendan Carr has written to Apple and Google to request that both companies remove the incredibly popular TikTok app from their stores, citing a threat to national security.

      Is your data going TikTok?

      Carr warns the app collects huge quantities of data and cited a recent report that claimed the company has accessed sensitive data collected from Americans. He argues that TikTok’s “pattern of conduct and misrepresentations regarding the unfettered access that persons in Beijing have to sensitive U.S. data...puts it out of compliance” with App Store security and privacy policies.

      To read this article in full, please click here

    • Italian spyware firm is hacking into iOS and Android devices, Google says Fri, 24 Jun 2022 08:51:00 -0700

      Google's Threat Analysis Group (TAG) has identified Italian vendor RCS Lab as a spyware offender, developing tools that are being used to exploit zero-day vulnerabilities to effect attacks on iOS and Android mobile users in Italy and Kazakhstan.

      According to a Google blog post on Thursday, RCS Lab uses a combination of tactics, including atypical drive-by downloads, as initial infection vectors. The company has developed tools to spy on the private data of the targeted devices, the post said.


    • 14 ways Google Lens can save you time on Android Wed, 22 Jun 2022 03:00:00 -0700

      Psst: Come close. Your Android phone has a little-known superpower — a futuristic system for bridging the physical world around you and the digital universe on your device. It's one of Google's best-kept secrets. And it can save you tons of time and effort.

      It's a little somethin' called Google Lens, and it's been lurking around on Android and quietly getting more and more capable for years. Google doesn't make a big deal about it, weirdly enough, and you really have to go out of your way to even realize it exists. But once you uncover it, well, you'll feel like you have a magic wand in your pocket.

      At its core, Google Lens is best described as a search engine for the real world. It uses artificial intelligence to identify text and objects both within images and in a live view from your phone's camera, and it then lets you learn about and interact with those elements in all sorts of interesting ways. But while Lens's ability to, say, identify a flower, look up a book, or give you info about a landmark is certainly impressive, it's the system's more mundane-seeming productivity powers that are far more likely to find a place in your day-to-day life.


    • The killer calendar app your Chromebook's been missing Wed, 15 Jun 2022 03:00:00 -0700

      Let me just go on the record as saying: The Google Calendar website is fine.

      And fine really is the most appropriate word here. Google's default desktop Calendar interface is perfectly functional, and it gets the job done.

      It's good enough, in fact — until you experience a truly exceptional Chrome OS calendar alternative and realize how much more efficient, effective, and generally enjoyable your Chromebook-based agenda juggling could be.

      I've been raving endlessly about my favorite Google-connecting desktop calendar app of the moment, the recently-acquired Cron, and lemme tell ya: Phenomenal doesn't even begin to describe it.


    • 6 custom Android shortcuts that'll supercharge your efficiency Wed, 08 Jun 2022 03:00:00 -0700

      Quick: When's the last time you really, truly thought about your Android phone's Quick Settings setup?

      If you're like most mammals I know, the answer probably ranges somewhere between "eons ago" and "never." And it's no surprise: Android's Quick Settings area is one of those things that's just sort of there. It's convenient, sure, but it's all too easy to forget that it's completely customizable — and expandable, too. It can turn into an invaluable home for your own custom Android shortcuts, if you take the time to build it up accordingly.

      The challenge, aside from simply remembering that you can expand that area of your phone's interface, is knowing where to begin. Google doesn't exactly have any great way of tracking down and identifying apps that offer Quick Settings additions, and even when you have an app with a cool Quick Settings option on your phone, you might not even realize it's there.

    • 6 secret shortcuts in Chrome on Android Wed, 01 Jun 2022 03:00:00 -0700

      Goodness gracious, I sure do love saving seconds. And if there's one area where wasted moments are just begging to be reclaimed, it's within the shiny Chrome browser on your favorite Android phone.

      Google's Android Chrome app is an absolute gold mine when it comes to hidden shortcuts and underappreciated time-savers. And despite the fact that we went over a ton of top-notch time-savers for the Chrome Android environment a handful of months back, I kept thinking to myself: "Gee wilikers, Mr. Wigglesby, there's gotta be more."

    • 6 Android settings for smarter notifications Fri, 27 May 2022 02:45:00 -0700

      Ah, notifications. Has any other technological wonder managed to be so incredibly helpful and so impossibly irritating at the same time?

      Notifications truly are one of our smartphones' greatest strengths — and one of their most irksome annoyances. They keep us connected to important info and yet also keep us tethered to our digital lives at the most inopportune times.

      Here in the land o' Android, notifications are actually designed in a sensible way that makes 'em reasonably easy to manage and customize. (The same can't be said for, ahem, certain other smartphone platforms.)

      But taking total advantage of Android's notification intelligence requires a teensy bit of effort. Some of the most helpful and advanced notification options are buried in the software and need a virtual treasure map (and/or a skosh of gentle coaxing) to be summoned into action.

    • Android 13's dueling identities Wed, 25 May 2022 03:00:00 -0700

      When Android 13 officially arrives this summer, we're bound to see a bunch of befuddled head scratching.

      It won't be because of heat-induced brain fog and/or dry scalp, either — not entirely, anyway. Instead, this fresh crop of confusion will stem from the fact that after months of buildup and anticipation, Google's latest and greatest Android version isn't gonna look like much for the majority of Android-owning organisms.

      Sounds strange, I know, but it's true: For anyone carrying a phone that was already running last year's Android 12 software, Android 13 is shaping up to be an incredibly subtle, almost-not-even-noticeable change — at least on a surface level.

    • What’s so great about Google’s ‘translation glasses’? Fri, 20 May 2022 03:00:00 -0700

      Google teased translation glasses at last week's Google I/O developer conference, holding out the promise that you can one day talk with someone speaking in a foreign language, and see the English translation in your glasses.

      Company execs demonstrated the glasses in a video; it showed not only “closed captioning” — real-time text spelling out in the same language what another person is saying — but also translation to and from English and Mandarin or Spanish. That would let people speaking two different languages carry on a conversation, while also letting hearing-impaired users see what others are saying to them.

    • Google's potentially pivotal Pixel Tablet curveball Wed, 18 May 2022 03:00:00 -0700

      As Google looks toward the future of Android and its platform-wide tablet philosophy, you can't help but be reminded of the past.

      'Twas a whopping 11 years ago, after all, that Google first turned its focus toward creating an optimal Android tablet experience. Back in the prehistoric era of 2011, Le Googlé launched its initial Android tablet push with the introduction of the Android 3.0 Honeycomb software and an effort to get developers on board with big-screen app interface optimizations.

      That effort didn't last for long, to say the least. Within about a year, Google — well, y'know, Googled. The company lost its focus, pivoted away from its vision, and ultimately just let the idea of the Android tablet languish without any meaningful movement forward or any real platform-level promotion.

    • The missing piece in Google's Pixel puzzle Fri, 13 May 2022 02:45:00 -0700

      All right, stop me if you've heard this before: Google's about to get serious about hardware.

      Yeah, yeah — I know. I'll pause for a second while you regain your composure.

      Look, I'm a huge fan of what Google's trying to do with its Pixel products. If you've read my ramblings for long (or seen the NSFW multicolored "P"-logo tattoos on various parts of my person), you know how I feel about the Pixel's place in the Android ecosystem and the critical role it plays. (Just kidding about the tattoos, by the way.) (For now.)

      But the truth is that we've been hearing the "Google's about to get serious about hardware" line for a long time now — over and over and over again. At a certain point, you've gotta ask: "Uh, gang? When is this actually starting?!"

    • Why Apple needs to evict old and unsupported App Store apps Tue, 10 May 2022 09:08:00 -0700

      Apple’s recently announced plan to remove unloved older apps from the App Store may have annoyed some developers, but with more than 1 million abandoned apps littered across Google’s and Apple’s app stores, the evidence supports the decision.

      What Apple said about its plans

      In an April note to developers, Apple warned that it intends to begin removing old apps that have not been updated for three or more years and have seen few downloads in the preceding 12 months.

      “We are implementing an ongoing process of evaluating apps, removing apps that no longer function as intended, don’t follow current review guidelines, or are outdated,” the company said.

    • 3 clever new tricks to turn Google Docs into a collaboration superhub Tue, 10 May 2022 03:00:00 -0700

      Google's annual I/O developers' conference kicks off on Wednesday, and we're sure to see all sorts of intriguing new stuff across the entire suite of Google services.

      Here's a little secret, though: You don't have to wait 'til then to find something new and useful. Google rolls out game-changing additions to its apps and products almost constantly, all year long. Most of the goodies just show up with surprisingly little fanfare and end up getting lost in the shuffle.

      That's why today, as we sit patiently and twiddle our collective thumbs ahead of Google's big ol' honkin' announcement extravaganza, I want to draw your attention to a series of spectacular additions in the oft-dusty Google Docs domain. These new features quietly crept into the software over the past several weeks, but most mere mortals would have no way of even knowing.

    • Google, others adding office space in anticipation of the great return Mon, 09 May 2022 03:00:00 -0700

      Since January 2020, Google’s parent company Alphabet has spent nearly $100 million expanding its U.S. commercial real estate portfolio, including a $28.5 million office it bought in Sunnyvale, CA, at the height of the pandemic.

      More recently, Alphabet announced in January it would spend $1 billion for a campus-like office setting in London.

      “We'll be introducing new types of collaboration spaces for in-person teamwork, as well as creating more overall space to improve wellbeing,” Ronan Harris, managing director of Google UK wrote in a blog post. “We’ll introduce team pods, which are flexible new space types that can be reconfigured in multiple ways, supporting focused work, collaboration or both, based on team needs. The new refurbishment will also feature outdoor covered working spaces to enable work in the fresh air.”

    • Apple employees revolt against mandatory back-to-work policy Fri, 06 May 2022 03:00:00 -0700

      A group of Apple employees is pushing back against a mandate by the company requiring them to return to the office three days a week. The group, which calls itself “Apple Together,” published an open letter to executives criticizing the company’s Hybrid Work Pilot program, characterizing it as inflexible.

      Among other grievances, the anonymous letter described the company’s requirement that employees spend three days in the office as showing “almost no flexibility at all.”

      "Office-bound work is a technology from the last century, from the era before ubiquitous video-call-capable internet and everyone being on the same internal chat application," the letter says. "But the future is about connecting when it makes sense, with people who have relevant input, no matter where they are based."

    • 7 hidden Pixel shortcuts you probably aren't using Fri, 06 May 2022 02:45:00 -0700

      We're likely just days away from the launch of Google's latest Pixel phone — the potentially pivotal Pixel 6a midranger. So it seems safe to say the subject of Googley phones is gonna be comin' up a bunch in the weeks ahead, with snazzy new hardware being the main theme of the moment.

      The nice thing about Pixel phones, though, is that you don't have to have the latest and greatest model in order to find some fantastically useful new tricks. Google's constantly updating its Pixels with features both big and small, and it's all too easy for some of the more subtle touches to get lost in the shuffle.

    • Google acquires Raxium in augmented reality push Thu, 05 May 2022 03:58:00 -0700

      Google has acquired Raxium, a five-year-old Bay Area startup working on microLED display technologies for wearables and augmented and virtual reality (AR and VR) headsets.

      “Raxium’s technical expertise in this area will play a key role as we continue to invest in our hardware efforts,” Rick Osterloh, senior vice president of devices and services at Google, wrote in a blog post. The Raxium team will immediately join Google’s devices and services team.

      The financial terms of the deal were undisclosed, but it could be worth as much as $1 billion, according to earlier reports by The Information.

    • Microsoft Edge has edged out Apple's Safari in browser popularity Thu, 05 May 2022 03:33:00 -0700

      Microsoft Edge has passed Apple's Safari to become the world's second most popular desktop browser based on data provided by web analytics service StatCounter.

      In February, Microsoft Edge was on the cusp of catching Safari, with less than half a percentage point (9.54% vs. 9.84%) separating the two browsers in popularity among desktop users. StatCounter's latest figures show Edge is now used on 10.07% of desktop computers worldwide, 0.46 percentage points ahead of Safari, which dropped to 9.61%.

      Google Chrome still holds the top spot by a long shot, at 66.58% of all desktop users. And Mozilla's Firefox had just 7.87% of the share, a significant drop from the 9.18% share it held in February. The new data was first reported by MacRumors.
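
      The gaps those StatCounter figures imply are easy to verify yourself; here's a quick back-of-the-envelope check in plain Python, using only the shares quoted above:

```python
# Desktop browser shares from StatCounter, as cited above (in percent)
edge, safari = 10.07, 9.61
firefox_now, firefox_feb = 7.87, 9.18

# Edge's lead over Safari, in percentage points
print(round(edge - safari, 2))               # 0.46
# Firefox's decline since February
print(round(firefox_feb - firefox_now, 2))   # 1.31
```

      (The `round` calls are just there to tidy up floating-point noise in the subtraction; the underlying arithmetic matches the numbers in the paragraph above.)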

    • Google offers US businesses $100,000 worth of digital skills training Tue, 03 May 2022 03:55:00 -0700

      Google has announced plans to provide $100,000 worth of Google Career Certificates to US-based businesses that want to train their employees in data analytics, digital marketing, IT support, project management, or user experience (UX) design.

      Eligible organizations can apply for up to 500 scholarships each in a variety of digital skills. Google says that no previous experience is required and credentials can be earned over a period of either three or six months of part-time study.

    • Download: UEM vendor comparison chart 2022 Tue, 03 May 2022 03:00:00 -0700

      Unified endpoint management (UEM) is a strategic IT approach that consolidates how enterprises secure and manage an array of deployed devices including phones, tablets, PCs, and even IoT devices.
