Over the past few days, Google has come under fire over its autocomplete predictions in Google Search around former President Donald Trump. In the wake of the issue, Google updated its autocomplete prediction system to make improvements in these cases. I am sure you have all seen the headlines and some of Google’s responses by now; I had not covered it here, but I did post links to it in my newsletters.
Now that Google has confirmed it made changes to its autocomplete predictions, I figured I’d cover those changes here. Also, it is hard to get AI to make images of Trump, so I went with a Google-colored crystal ball (i.e., predictions).
I should add that Trump also said Google’s results were rigged against him back in 2018.
Google first posted about the situation on X via its Google Communications account. Google said it did not censor or ban any specific term:
Over the past few days, some people on X have posted claims that Search is “censoring” or “banning” particular terms. That’s not happening, and we want to set the record straight.
The posts relate to our Autocomplete feature, which predicts queries to save you time. Autocomplete is just a tool to help you complete a search quickly. Regardless of what predictions it shows at any given moment, you can always search for whatever you want and get easy access to results, images and more.
Here’s what happened, why and how we responded to it.
Then Google said that “it has built-in protections related to political violence,” referring to its Autocomplete feature. Google does not want to show autocomplete suggestions about committing violence against any political candidate.
But the issue here is that “those systems were out of date.” They had not been updated to account for the assassination attempt on Trump. “After the horrific events in Butler, PA, those predicted queries should have appeared but didn’t,” Google wrote.
“Once the issue was flagged, we started working on improvements, and they’re already rolling out. You can see many relevant predictions now,” Google said. Yes, Google pushed out “improvements” to this system based on these issues and showed it working now (see the posts below).
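To make the shape of that problem concrete, here is a purely illustrative Python sketch (my own, not Google’s code, and every name in it is hypothetical) of how a safety filter over candidate predictions can over-suppress when it has not been updated for a current news event:

```python
# Purely illustrative sketch -- Google has not published how its Autocomplete
# protections are implemented. This only shows how a stale safety filter could
# suppress otherwise-relevant predictions until it is updated.

VIOLENCE_BLOCKLIST = ["assassination", "shooting"]   # hypothetical blocked terms
NEWSWORTHY_ALLOWLIST: set[str] = set()               # empty = the filter is "out of date"

def filter_predictions(candidates: list[str]) -> list[str]:
    """Drop predictions matching the political-violence blocklist,
    unless they have been explicitly allowed as newsworthy."""
    kept = []
    for prediction in candidates:
        blocked = any(term in prediction for term in VIOLENCE_BLOCKLIST)
        if blocked and prediction not in NEWSWORTHY_ALLOWLIST:
            continue  # suppressed by the safety protection
        kept.append(prediction)
    return kept

candidates = ["trump assassination attempt", "assassination attempt news"]
print(filter_predictions(candidates))  # [] -- everything suppressed while the filter is stale

# After an update that recognizes the real-world event as newsworthy:
NEWSWORTHY_ALLOWLIST.update(candidates)
print(filter_predictions(candidates))  # the predictions now appear
```

The point is only that a protection like this fails closed: until someone updates it, legitimate queries about a real event are hidden along with the queries it was designed to block.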
Google also was not showing relevant predictions for “President Donald,” former President Obama and others. Google fixed this as well after it was flagged. Google wrote, “people posted about how Autocomplete wasn’t showing relevant predictions for “President Donald.” This particular issue was a bug that spanned the political spectrum, also affecting queries for several past presidents, such as former President Obama, as you can see in the attached image. Typing “vice president k” was also showing no predictions. We’ve made an update that has improved these predictions across the board.”
Google also responded to questions about how its news clustering and Top Stories labels work. Google wrote, “Some people also posted that searches for “Donald Trump” returned news stories related to “Kamala Harris.” These labels are automatically generated based on related news topics, and they change over time. They span the political spectrum as well: For example, a search for “Kamala Harris” showed Top Stories labeled with “Donald Trump,” because many articles cover the two of them together. You can see this happening across a range of topics, like the Olympics, other public figures, companies and more. Our goal is to help people get relevant results for their query.”
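Google has not said how those labels are generated beyond “related news topics,” but a toy co-occurrence count shows why a search for one candidate can end up labeled with the other. The sketch below is purely illustrative (my own Python, with a made-up headline list and entity list, not Google’s system):

```python
# Purely illustrative -- Google has not published how Top Stories labels are
# generated. This toy example only shows how labels based on topics that
# co-occur across many articles could pair one candidate with another.

from collections import Counter

headlines = [
    "Donald Trump and Kamala Harris clash over debate schedule",
    "Kamala Harris responds to Donald Trump rally remarks",
    "Kamala Harris campaign announces new fundraising totals",
]

ENTITIES = ["Donald Trump", "Kamala Harris"]  # hypothetical entity list

def related_label(query_entity: str) -> str | None:
    """Return the entity most often mentioned alongside the queried one,
    which is what a co-occurrence-based label would surface."""
    co_counts = Counter()
    for headline in headlines:
        if query_entity in headline:
            for other in ENTITIES:
                if other != query_entity and other in headline:
                    co_counts[other] += 1
    most_common = co_counts.most_common(1)
    return most_common[0][0] if most_common else None

print(related_label("Kamala Harris"))  # "Donald Trump" -- they co-occur in most headlines
print(related_label("Donald Trump"))   # "Kamala Harris" for the same reason
```

In other words, when two names appear together in enough coverage, a label pairing them simply reflects the news cycle rather than an editorial choice, which is the point Google was making.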
Finally, Google ended with this:
Overall, these types of prediction and labeling systems are algorithmic. While our systems work very well most of the time, you can find predictions that may be unexpected or imperfect, and bugs will occur. Many platforms, including the one we’re posting on now, will show strange or incomplete predictions at various times. For our part, when issues come up, we will make improvements so you can find what you’re looking for, quickly and easily. We appreciate the feedback.
Here are those posts:
(2/5) First, Autocomplete wasn’t providing predictions for queries about the assassination attempt against former President Trump. That’s because it has built-in protections related to political violence — and those systems were out of date.
After the horrific events in Butler,… pic.twitter.com/kwbV0WgLIn
— Google Communications (@Google_Comms) July 30, 2024
(4/5) Some people also posted that searches for “Donald Trump” returned news stories related to “Kamala Harris.” These labels are automatically generated based on related news topics, and they change over time. They span the political spectrum as well: For example, a search for… pic.twitter.com/55u1b5ySCr
— Google Communications (@Google_Comms) July 30, 2024
(5/5) Overall, these types of prediction and labeling systems are algorithmic. While our systems work very well most of the time, you can find predictions that may be unexpected or imperfect, and bugs will occur. Many platforms, including the one we’re posting on now, will show…
— Google Communications (@Google_Comms) July 30, 2024
Here are some of the previous posts Google responded to while this was happening:
There was no manual action taken. Our systems have protections against Autocomplete predictions associated with political violence, which were working as intended prior to this horrific event. We’re working on improvements to ensure our systems are more up to date. Of course,…
— Google Communications (@Google_Comms) July 28, 2024
Hi. Some insight into what’s happening here: We’ve got protections in place against Autocomplete predictions associated with political violence — normally a good thing that works as intended. That said, we’re working on improvements to ensure our systems are more up to date. We…
— Google Communications (@Google_Comms) July 28, 2024
That’s right — we have protections in place for topics like political violence, which typically work well. But we’re working on improvements to make our predictions more up to date. Regardless of what Autocomplete shows, people can always search for whatever they want to, and we…
— Google Communications (@Google_Comms) July 28, 2024
Some context here: We’ve got protections in place against Autocomplete predictions associated with political violence, which is normally a good thing. That said, we’re working on improvements to ensure our systems are more up to date. Regardless of what Autocomplete shows, you…
— Google Communications (@Google_Comms) July 28, 2024
Some context here: These labels are automatically generated based on common topics across many news articles, and they change over time. A search today for Kamala Harris showed Top Stories labeled with “Donald Trump,” because many articles cover the two of them together. You can… pic.twitter.com/DIcReIYALL
— Google Communications (@Google_Comms) July 29, 2024
Yes, that’s right. These labels are automatically generated based on common topics across many news articles, and they change over time. A search for “Harris” might result in a “Harris • Donald Trump” Top Stories label because news orgs frequently cover those terms together. You… pic.twitter.com/kwuqjBoSSa
— Google Communications (@Google_Comms) July 29, 2024
Actually, you can and often do get news about Donald Trump for Kamala Harris searches. Those labels are automatically generated based on common topics across many news articles, and they change over time. Here’s an example of a “Kamala Harris” Search results page showing Top… pic.twitter.com/mJgLyP2XJr
— Google Communications (@Google_Comms) July 29, 2024
Forum discussion at X.
Source: Seroundtable.com